Dec 17 09:04:39 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 17 09:04:39 crc restorecon[4681]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 17 09:04:39 crc restorecon[4681]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc 
restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc 
restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 
09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 
crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 
17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 
crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc 
restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:39 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:39 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 17 09:04:40 crc restorecon[4681]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc 
restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 17 09:04:40 crc restorecon[4681]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 17 09:04:40 crc kubenswrapper[4935]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 17 09:04:40 crc kubenswrapper[4935]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 17 09:04:40 crc kubenswrapper[4935]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 17 09:04:40 crc kubenswrapper[4935]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 17 09:04:40 crc kubenswrapper[4935]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 17 09:04:40 crc kubenswrapper[4935]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.874532 4935 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880153 4935 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880191 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880202 4935 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880215 4935 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880229 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880252 4935 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880264 4935 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880303 4935 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880313 4935 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880323 4935 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880333 4935 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880343 4935 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880352 4935 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880360 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880369 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880378 4935 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880386 4935 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880396 4935 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880405 4935 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880415 4935 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880424 4935 feature_gate.go:330] 
unrecognized feature gate: SigstoreImageVerification Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880434 4935 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880442 4935 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880452 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880463 4935 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880473 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880482 4935 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880491 4935 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880500 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880510 4935 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880519 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880528 4935 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880541 4935 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880551 4935 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880560 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880570 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880579 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880588 4935 feature_gate.go:330] unrecognized feature gate: Example Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880597 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880606 4935 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880615 4935 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880626 4935 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880635 4935 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880644 4935 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880653 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880663 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880674 4935 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880684 
4935 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880693 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880702 4935 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880711 4935 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880720 4935 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880729 4935 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880738 4935 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880746 4935 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880755 4935 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880763 4935 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880772 4935 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880781 4935 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880789 4935 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880798 4935 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880806 4935 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 17 
09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880821 4935 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880829 4935 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880838 4935 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880846 4935 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880854 4935 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880864 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880872 4935 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880881 4935 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.880889 4935 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881090 4935 flags.go:64] FLAG: --address="0.0.0.0" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881110 4935 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881128 4935 flags.go:64] FLAG: --anonymous-auth="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881141 4935 flags.go:64] FLAG: --application-metrics-count-limit="100" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881175 4935 flags.go:64] FLAG: --authentication-token-webhook="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881185 4935 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: 
I1217 09:04:40.881198 4935 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881211 4935 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881221 4935 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881231 4935 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881242 4935 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881252 4935 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881263 4935 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881299 4935 flags.go:64] FLAG: --cgroup-root="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881309 4935 flags.go:64] FLAG: --cgroups-per-qos="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881319 4935 flags.go:64] FLAG: --client-ca-file="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881329 4935 flags.go:64] FLAG: --cloud-config="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881339 4935 flags.go:64] FLAG: --cloud-provider="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881350 4935 flags.go:64] FLAG: --cluster-dns="[]" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881362 4935 flags.go:64] FLAG: --cluster-domain="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881371 4935 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881382 4935 flags.go:64] FLAG: --config-dir="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881393 4935 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Dec 17 
09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881403 4935 flags.go:64] FLAG: --container-log-max-files="5" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881416 4935 flags.go:64] FLAG: --container-log-max-size="10Mi" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881426 4935 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881436 4935 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881447 4935 flags.go:64] FLAG: --containerd-namespace="k8s.io" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881458 4935 flags.go:64] FLAG: --contention-profiling="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881469 4935 flags.go:64] FLAG: --cpu-cfs-quota="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881478 4935 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881489 4935 flags.go:64] FLAG: --cpu-manager-policy="none" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881499 4935 flags.go:64] FLAG: --cpu-manager-policy-options="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881511 4935 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881521 4935 flags.go:64] FLAG: --enable-controller-attach-detach="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881532 4935 flags.go:64] FLAG: --enable-debugging-handlers="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881541 4935 flags.go:64] FLAG: --enable-load-reader="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881551 4935 flags.go:64] FLAG: --enable-server="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881561 4935 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881574 4935 
flags.go:64] FLAG: --event-burst="100" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881585 4935 flags.go:64] FLAG: --event-qps="50" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881595 4935 flags.go:64] FLAG: --event-storage-age-limit="default=0" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881607 4935 flags.go:64] FLAG: --event-storage-event-limit="default=0" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881619 4935 flags.go:64] FLAG: --eviction-hard="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881632 4935 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881642 4935 flags.go:64] FLAG: --eviction-minimum-reclaim="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881654 4935 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881664 4935 flags.go:64] FLAG: --eviction-soft="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881674 4935 flags.go:64] FLAG: --eviction-soft-grace-period="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881684 4935 flags.go:64] FLAG: --exit-on-lock-contention="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881693 4935 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881703 4935 flags.go:64] FLAG: --experimental-mounter-path="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881713 4935 flags.go:64] FLAG: --fail-cgroupv1="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881722 4935 flags.go:64] FLAG: --fail-swap-on="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881733 4935 flags.go:64] FLAG: --feature-gates="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881745 4935 flags.go:64] FLAG: --file-check-frequency="20s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881755 4935 flags.go:64] 
FLAG: --global-housekeeping-interval="1m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881766 4935 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881776 4935 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881786 4935 flags.go:64] FLAG: --healthz-port="10248" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881796 4935 flags.go:64] FLAG: --help="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881807 4935 flags.go:64] FLAG: --hostname-override="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881817 4935 flags.go:64] FLAG: --housekeeping-interval="10s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881827 4935 flags.go:64] FLAG: --http-check-frequency="20s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881838 4935 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881847 4935 flags.go:64] FLAG: --image-credential-provider-config="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881857 4935 flags.go:64] FLAG: --image-gc-high-threshold="85" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881867 4935 flags.go:64] FLAG: --image-gc-low-threshold="80" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881877 4935 flags.go:64] FLAG: --image-service-endpoint="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881887 4935 flags.go:64] FLAG: --kernel-memcg-notification="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881896 4935 flags.go:64] FLAG: --kube-api-burst="100" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881906 4935 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881916 4935 flags.go:64] FLAG: --kube-api-qps="50" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881926 
4935 flags.go:64] FLAG: --kube-reserved="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881937 4935 flags.go:64] FLAG: --kube-reserved-cgroup="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881946 4935 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881956 4935 flags.go:64] FLAG: --kubelet-cgroups="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881966 4935 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881977 4935 flags.go:64] FLAG: --lock-file="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881986 4935 flags.go:64] FLAG: --log-cadvisor-usage="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.881997 4935 flags.go:64] FLAG: --log-flush-frequency="5s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882007 4935 flags.go:64] FLAG: --log-json-info-buffer-size="0" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882022 4935 flags.go:64] FLAG: --log-json-split-stream="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882031 4935 flags.go:64] FLAG: --log-text-info-buffer-size="0" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882041 4935 flags.go:64] FLAG: --log-text-split-stream="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882052 4935 flags.go:64] FLAG: --logging-format="text" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882062 4935 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882074 4935 flags.go:64] FLAG: --make-iptables-util-chains="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882084 4935 flags.go:64] FLAG: --manifest-url="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882094 4935 flags.go:64] FLAG: --manifest-url-header="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882108 4935 
flags.go:64] FLAG: --max-housekeeping-interval="15s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882118 4935 flags.go:64] FLAG: --max-open-files="1000000" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882130 4935 flags.go:64] FLAG: --max-pods="110" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882142 4935 flags.go:64] FLAG: --maximum-dead-containers="-1" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882152 4935 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882163 4935 flags.go:64] FLAG: --memory-manager-policy="None" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882173 4935 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882183 4935 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882193 4935 flags.go:64] FLAG: --node-ip="192.168.126.11" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882203 4935 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882223 4935 flags.go:64] FLAG: --node-status-max-images="50" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882234 4935 flags.go:64] FLAG: --node-status-update-frequency="10s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882245 4935 flags.go:64] FLAG: --oom-score-adj="-999" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882255 4935 flags.go:64] FLAG: --pod-cidr="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882265 4935 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882306 4935 flags.go:64] FLAG: 
--pod-manifest-path="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882316 4935 flags.go:64] FLAG: --pod-max-pids="-1" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882326 4935 flags.go:64] FLAG: --pods-per-core="0" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882336 4935 flags.go:64] FLAG: --port="10250" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882346 4935 flags.go:64] FLAG: --protect-kernel-defaults="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882356 4935 flags.go:64] FLAG: --provider-id="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882366 4935 flags.go:64] FLAG: --qos-reserved="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882377 4935 flags.go:64] FLAG: --read-only-port="10255" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882386 4935 flags.go:64] FLAG: --register-node="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882397 4935 flags.go:64] FLAG: --register-schedulable="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882407 4935 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882424 4935 flags.go:64] FLAG: --registry-burst="10" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882433 4935 flags.go:64] FLAG: --registry-qps="5" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882445 4935 flags.go:64] FLAG: --reserved-cpus="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882454 4935 flags.go:64] FLAG: --reserved-memory="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882466 4935 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882478 4935 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882488 4935 flags.go:64] FLAG: --rotate-certificates="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 
09:04:40.882497 4935 flags.go:64] FLAG: --rotate-server-certificates="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882507 4935 flags.go:64] FLAG: --runonce="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882517 4935 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882528 4935 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882538 4935 flags.go:64] FLAG: --seccomp-default="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882548 4935 flags.go:64] FLAG: --serialize-image-pulls="true" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882558 4935 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882568 4935 flags.go:64] FLAG: --storage-driver-db="cadvisor" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882579 4935 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882589 4935 flags.go:64] FLAG: --storage-driver-password="root" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882599 4935 flags.go:64] FLAG: --storage-driver-secure="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882609 4935 flags.go:64] FLAG: --storage-driver-table="stats" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882619 4935 flags.go:64] FLAG: --storage-driver-user="root" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882629 4935 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882641 4935 flags.go:64] FLAG: --sync-frequency="1m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882651 4935 flags.go:64] FLAG: --system-cgroups="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882662 4935 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882677 4935 flags.go:64] FLAG: --system-reserved-cgroup="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882687 4935 flags.go:64] FLAG: --tls-cert-file="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882697 4935 flags.go:64] FLAG: --tls-cipher-suites="[]" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882709 4935 flags.go:64] FLAG: --tls-min-version="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882718 4935 flags.go:64] FLAG: --tls-private-key-file="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882728 4935 flags.go:64] FLAG: --topology-manager-policy="none" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882738 4935 flags.go:64] FLAG: --topology-manager-policy-options="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882748 4935 flags.go:64] FLAG: --topology-manager-scope="container" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882757 4935 flags.go:64] FLAG: --v="2" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882770 4935 flags.go:64] FLAG: --version="false" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882784 4935 flags.go:64] FLAG: --vmodule="" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882796 4935 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.882807 4935 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883019 4935 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883032 4935 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883042 4935 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883051 4935 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883061 4935 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883070 4935 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883079 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883088 4935 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883098 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883106 4935 feature_gate.go:330] unrecognized feature gate: Example Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883115 4935 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883127 4935 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883137 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883147 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883156 4935 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883166 4935 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883175 4935 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883183 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883192 4935 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883203 4935 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883213 4935 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883222 4935 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883232 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883241 4935 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883251 4935 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883260 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883269 4935 feature_gate.go:330] unrecognized feature 
gate: GCPLabelsTags Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883304 4935 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883313 4935 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883322 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883331 4935 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883340 4935 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883349 4935 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883359 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883367 4935 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883377 4935 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883386 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883395 4935 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883405 4935 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883416 4935 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883427 4935 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883437 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883446 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883455 4935 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883464 4935 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883473 4935 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883483 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883492 4935 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883501 4935 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883510 4935 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883519 4935 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883529 4935 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883538 4935 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883547 4935 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883555 4935 
feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883564 4935 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883573 4935 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883582 4935 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883590 4935 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883599 4935 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883607 4935 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883616 4935 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883625 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883633 4935 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883643 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883652 4935 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883661 4935 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883673 4935 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883684 4935 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883696 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.883705 4935 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.883944 4935 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.895393 4935 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.896079 4935 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898052 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898096 4935 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898111 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898117 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898124 4935 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898129 4935 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898134 4935 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898139 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898152 4935 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898160 4935 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.898524 4935 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899040 4935 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899056 4935 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899062 4935 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899067 4935 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899072 4935 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899076 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899081 4935 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899085 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899089 4935 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899094 4935 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899098 4935 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899103 4935 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899107 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899111 4935 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899115 4935 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899118 4935 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899122 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899130 4935 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899137 4935 feature_gate.go:330] unrecognized feature gate: Example Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899142 4935 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899146 4935 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899150 4935 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899155 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899166 4935 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899170 4935 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899174 4935 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899178 4935 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899181 4935 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899186 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899190 4935 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899195 4935 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899198 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 
09:04:40.899202 4935 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899207 4935 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899211 4935 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899215 4935 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899219 4935 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899223 4935 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899226 4935 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899231 4935 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899235 4935 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899241 4935 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899246 4935 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899250 4935 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899254 4935 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899258 4935 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899262 4935 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899266 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899295 4935 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899300 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899303 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899307 4935 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899313 4935 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899317 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899321 4935 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899325 4935 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899329 4935 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899333 4935 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899337 4935 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899342 4935 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.899351 4935 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899522 4935 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899530 4935 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899534 4935 feature_gate.go:330] unrecognized feature gate: Example Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899538 4935 feature_gate.go:330] unrecognized feature gate: 
PinnedImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899581 4935 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899586 4935 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899591 4935 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899595 4935 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899599 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899603 4935 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899607 4935 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899610 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899615 4935 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899620 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899623 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899627 4935 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899631 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899634 4935 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899638 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899642 4935 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899646 4935 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899649 4935 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899653 4935 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899657 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899660 4935 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899664 4935 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899667 4935 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 17 09:04:40 crc kubenswrapper[4935]: 
W1217 09:04:40.899673 4935 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899677 4935 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899681 4935 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899685 4935 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899688 4935 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899692 4935 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899696 4935 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899699 4935 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899703 4935 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899706 4935 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899710 4935 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899714 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899718 4935 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899722 4935 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899725 4935 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899729 4935 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899733 4935 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899737 4935 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899741 4935 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899744 4935 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899748 4935 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899752 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899756 4935 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899759 4935 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899763 4935 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899767 4935 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899772 4935 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899776 4935 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899780 4935 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899783 4935 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899787 4935 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899790 4935 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899795 4935 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899799 4935 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899802 4935 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899813 4935 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899818 4935 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899823 4935 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899828 4935 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899832 4935 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
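The recurring `unrecognized feature gate` warnings above come from OpenShift-specific gates that the upstream kubelet feature-gate parser does not know about; each registration pass re-emits the full set, which makes the raw log hard to scan. A quick way to see the distinct gate names being rejected is to filter the saved journal output. A minimal sketch, assuming the log has been saved to a file named `kubelet.log` (the filename is illustrative):

```shell
# Pull out each gate name following "unrecognized feature gate: ",
# then deduplicate the list.
grep -o 'unrecognized feature gate: [A-Za-z0-9]*' kubelet.log \
  | awk '{print $NF}' \
  | sort -u
```

Against this capture, that yields one line per gate (AdminNetworkPolicy, DNSNameResolver, GatewayAPI, and so on) instead of hundreds of repeated warnings.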
Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899837 4935 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899841 4935 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899846 4935 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 17 09:04:40 crc kubenswrapper[4935]: W1217 09:04:40.899850 4935 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.899855 4935 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.900064 4935 server.go:940] "Client rotation is on, will bootstrap in background" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.916089 4935 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.916239 4935 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
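The `feature gates: {map[...]}` summary lines are Go's default `fmt` rendering of a `map[string]bool`, so the effective gate settings can be recovered mechanically from a log line. A minimal parsing sketch in Python (the function name and regex are illustrative, not part of any kubelet tooling):

```python
import re

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Parse Go's 'feature gates: {map[Name:true ...]}' rendering
    from a kubelet log line into a name -> enabled mapping."""
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", line)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

# Abbreviated version of the summary line logged above.
line = ("I1217 09:04:40.899855 4935 feature_gate.go:386] feature gates: "
        "{map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}")
gates = parse_feature_gates(line)
```

Applied to the full line above, this confirms that only `CloudDualStackNodeIPs`, `DisableKubeletCloudCredentialProviders`, `KMSv1`, and `ValidatingAdmissionPolicy` are enabled; everything else in the map is explicitly off.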
Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.916921 4935 server.go:997] "Starting client certificate rotation" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.916953 4935 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.917175 4935 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2026-01-03 04:02:18.981267582 +0000 UTC Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.917370 4935 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 402h57m38.063907482s for next certificate rotation Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.922804 4935 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.924625 4935 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 17 09:04:40 crc kubenswrapper[4935]: I1217 09:04:40.939632 4935 log.go:25] "Validated CRI v1 runtime API" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.011728 4935 log.go:25] "Validated CRI v1 image API" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.014192 4935 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.017181 4935 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-17-09-00-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.017215 4935 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.031651 4935 manager.go:217] Machine: {Timestamp:2025-12-17 09:04:41.029023795 +0000 UTC m=+0.688864588 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:a5a48762-63f5-465e-baf7-279b31b6b014 BootID:e24588ce-27b5-4ae2-a4f8-11ff903735be Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:df:c7:ec Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:df:c7:ec Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:56:bc:4c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:82:80:eb Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:aa:4b:3e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:8a:3f:2c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:8a:2a:82:5e:35:7c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:ce:8a:b0:59:18:49 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.032879 4935 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
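The `Machine:` record above reports capacities as raw byte counts (`MemoryCapacity:33654120448`, disk `Size:214748364800`), which are easier to sanity-check in GiB. A trivial conversion sketch, using values taken directly from the record (the `gib` helper is illustrative):

```python
def gib(n_bytes: int) -> float:
    """Convert a raw byte count to GiB (2**30 bytes)."""
    return n_bytes / 2**30

memory = gib(33654120448)   # MemoryCapacity from the Machine record
disk = gib(214748364800)    # vda disk Size from DiskMap
```

This works out to roughly 31.3 GiB of RAM and exactly 200 GiB for the `vda` disk, consistent with a CRC VM sized for an all-in-one cluster.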
Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.033397 4935 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.034662 4935 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.035124 4935 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.035300 4935 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.035748 4935 topology_manager.go:138] "Creating topology manager with none policy" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.035860 4935 container_manager_linux.go:303] "Creating device plugin manager" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.036328 4935 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.036499 4935 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.036852 4935 state_mem.go:36] "Initialized new in-memory state store" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.037083 4935 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.038607 4935 kubelet.go:418] "Attempting to sync node with API server" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.038786 4935 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.038977 4935 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.039137 4935 kubelet.go:324] "Adding apiserver pod source" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.039260 4935 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 
09:04:41.042072 4935 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.042737 4935 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.044577 4935 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045621 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045665 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045680 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045694 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045730 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045752 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045767 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045792 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045813 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045837 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045903 4935 
plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.045918 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.046863 4935 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Dec 17 09:04:41 crc kubenswrapper[4935]: W1217 09:04:41.046862 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:41 crc kubenswrapper[4935]: W1217 09:04:41.046875 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.046993 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.047016 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.048078 4935 server.go:1280] "Started kubelet" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 
09:04:41.048190 4935 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.049039 4935 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.048719 4935 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.049648 4935 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.049693 4935 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.049720 4935 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 15:32:45.147883821 +0000 UTC Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.049777 4935 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 390h28m4.098111172s for next certificate rotation Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.049788 4935 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.049811 4935 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.050003 4935 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.049809 4935 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.050927 4935 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.75:6443: connect: connection refused Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.050946 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Dec 17 09:04:41 crc systemd[1]: Started Kubernetes Kubelet. Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.051521 4935 factory.go:55] Registering systemd factory Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.051555 4935 factory.go:221] Registration of the systemd container factory successfully Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.052399 4935 factory.go:153] Registering CRI-O factory Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.052465 4935 factory.go:221] Registration of the crio container factory successfully Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.052588 4935 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.052637 4935 factory.go:103] Registering Raw factory Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.052672 4935 manager.go:1196] Started watching for new ooms in manager Dec 17 09:04:41 crc kubenswrapper[4935]: W1217 09:04:41.053238 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.053353 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.054144 4935 manager.go:319] Starting recovery of all containers Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.054172 4935 server.go:460] "Adding debug handlers to kubelet server" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.064361 4935 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1881f5569092f882 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-17 09:04:41.047488642 +0000 UTC m=+0.707329415,LastTimestamp:2025-12-17 09:04:41.047488642 +0000 UTC m=+0.707329415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.084385 4935 manager.go:324] Recovery completed Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.096945 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.099499 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.099563 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.099580 4935 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.100698 4935 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.100724 4935 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.100750 4935 state_mem.go:36] "Initialized new in-memory state store" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108106 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108164 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108176 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108189 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108200 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108212 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108225 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108238 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108254 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108265 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108302 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108312 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108322 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108335 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108347 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108356 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108367 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" 
seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108593 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108601 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108611 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108619 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108629 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108639 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108647 
4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108656 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108666 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108679 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108690 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108698 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108707 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108725 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108736 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108745 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108757 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108772 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108792 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108803 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108815 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108826 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108837 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108847 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108857 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108869 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108879 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108890 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108903 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108914 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108926 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108939 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108954 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108965 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108978 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108989 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.108999 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109008 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109018 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109028 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109038 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109048 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109058 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109068 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109077 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109086 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109098 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109188 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109205 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" 
seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109214 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109224 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109234 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109243 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109253 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109263 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 
09:04:41.109293 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109303 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109312 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109322 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109331 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109339 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109353 4935 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109362 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109372 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109382 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109392 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109401 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109410 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109418 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109428 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109437 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109447 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109456 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109464 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109473 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109482 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109491 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109500 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109508 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109517 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109526 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109535 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109545 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109553 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109562 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109571 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109580 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109593 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109602 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109612 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109622 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109631 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109641 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109653 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109662 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.109673 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110321 4935 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110355 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110367 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110377 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110388 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110398 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110408 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110417 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110427 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110437 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110445 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110455 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110464 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110474 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110484 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110493 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110502 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110511 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110519 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110528 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110536 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110545 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110554 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110562 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110570 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110579 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" 
seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110589 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110597 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110607 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110618 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110626 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110636 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 
09:04:41.110647 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110658 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110666 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110676 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110685 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110694 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110703 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110711 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110719 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110727 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110736 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110744 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110752 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110761 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110770 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110779 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110788 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110797 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110806 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110814 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110823 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110832 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110843 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110862 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110880 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110891 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110901 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110912 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110925 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110935 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110944 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110964 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110977 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110986 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.110996 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111004 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111013 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" 
seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111022 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111032 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111040 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111050 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111059 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111068 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111077 4935 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111086 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111095 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111116 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111133 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111146 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111156 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111164 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111172 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111182 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111192 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111202 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111210 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111219 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111230 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111240 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111250 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111262 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111293 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111303 4935 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111313 4935 reconstruct.go:97] "Volume reconstruction finished" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.111321 4935 reconciler.go:26] "Reconciler: start to sync state" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.120620 4935 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.122805 4935 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.122852 4935 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.122880 4935 kubelet.go:2335] "Starting kubelet main sync loop" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.122925 4935 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 17 09:04:41 crc kubenswrapper[4935]: W1217 09:04:41.123927 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.124091 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.143498 4935 policy_none.go:49] "None policy: Start" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.148619 4935 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.148674 4935 state_mem.go:35] "Initializing new in-memory state store" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.150194 4935 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.221287 4935 manager.go:334] "Starting Device Plugin manager" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.221368 4935 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.221383 4935 server.go:79] "Starting device plugin registration server" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.221796 4935 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.221813 4935 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.221986 4935 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.222146 4935 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.222162 4935 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.223044 4935 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.223242 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.224871 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.224912 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.224926 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.225138 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.225447 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.225495 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226040 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226098 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226113 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226323 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226358 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226415 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226798 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226874 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.226892 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228104 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228125 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228157 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228174 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228130 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228326 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228459 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: 
I1217 09:04:41.228576 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.228602 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229119 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229141 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229153 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229331 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229531 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229568 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229736 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229753 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.229764 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231478 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231512 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231487 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231567 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231583 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231525 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231893 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.231943 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.233255 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.233308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.233320 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.236142 4935 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.252079 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314180 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314231 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314262 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314358 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314418 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314643 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314751 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314822 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314863 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314884 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314904 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.314940 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 
09:04:41.314976 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.315003 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.315027 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.322645 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.323615 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.323645 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.323655 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.323677 4935 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 
09:04:41.324181 4935 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416062 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416132 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416337 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416417 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416336 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416381 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416544 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416580 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416626 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416641 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416658 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416692 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416703 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416731 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416741 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416778 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416789 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416807 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416832 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416834 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416880 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416904 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416918 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416946 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416970 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.417011 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.417050 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.416878 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.417102 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.417160 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.525224 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.526810 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.526849 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.526866 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.526895 4935 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 17 09:04:41 crc 
kubenswrapper[4935]: E1217 09:04:41.527381 4935 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.559130 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.582454 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.600539 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.609090 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.615160 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 17 09:04:41 crc kubenswrapper[4935]: W1217 09:04:41.620923 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-929bfac083f8673a285681bd655d40d4ca71e917a75a04a219445797689eb52f WatchSource:0}: Error finding container 929bfac083f8673a285681bd655d40d4ca71e917a75a04a219445797689eb52f: Status 404 returned error can't find the container with id 929bfac083f8673a285681bd655d40d4ca71e917a75a04a219445797689eb52f Dec 17 09:04:41 crc kubenswrapper[4935]: W1217 09:04:41.644084 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-e48c177e51bccfa0eee7e7c1a5fc8448ca281dbef5da41dfccb663fbfb76c678 WatchSource:0}: Error finding container e48c177e51bccfa0eee7e7c1a5fc8448ca281dbef5da41dfccb663fbfb76c678: Status 404 returned error can't find the container with id e48c177e51bccfa0eee7e7c1a5fc8448ca281dbef5da41dfccb663fbfb76c678 Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.652947 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.928402 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.929942 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.930013 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.930033 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:41 crc kubenswrapper[4935]: I1217 09:04:41.930078 4935 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 17 09:04:41 crc kubenswrapper[4935]: E1217 09:04:41.930851 4935 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 17 09:04:42 crc kubenswrapper[4935]: W1217 09:04:42.024171 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:42 crc kubenswrapper[4935]: E1217 09:04:42.024305 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.052679 4935 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.127825 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"8f5541035e07a213bcaaa008f8e69674300353bc6e35f6097e1f01cff3169985"} Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.128926 
4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e48c177e51bccfa0eee7e7c1a5fc8448ca281dbef5da41dfccb663fbfb76c678"} Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.130019 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d75f4390ba16fe5c9bec2933122acf14d88d138e7d38ec4e622bef48971f48fa"} Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.130978 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ab450b7c7d400081c045c99f55713976e6debdc6833d3fcbb677af6122d04f8b"} Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.131806 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"929bfac083f8673a285681bd655d40d4ca71e917a75a04a219445797689eb52f"} Dec 17 09:04:42 crc kubenswrapper[4935]: W1217 09:04:42.240876 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:42 crc kubenswrapper[4935]: E1217 09:04:42.240970 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:42 crc 
kubenswrapper[4935]: E1217 09:04:42.454145 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Dec 17 09:04:42 crc kubenswrapper[4935]: W1217 09:04:42.620442 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:42 crc kubenswrapper[4935]: E1217 09:04:42.620839 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:42 crc kubenswrapper[4935]: W1217 09:04:42.679239 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:42 crc kubenswrapper[4935]: E1217 09:04:42.679364 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.732076 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.734689 4935 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.734765 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.734782 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:42 crc kubenswrapper[4935]: I1217 09:04:42.734822 4935 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 17 09:04:42 crc kubenswrapper[4935]: E1217 09:04:42.735678 4935 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.052156 4935 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.137513 4935 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="41e323565ba951afc21c04e5fb8f9a05915dd0cafca9403aed4b57235a79b79c" exitCode=0 Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.137654 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"41e323565ba951afc21c04e5fb8f9a05915dd0cafca9403aed4b57235a79b79c"} Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.137671 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.139516 4935 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.139645 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.139728 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.140678 4935 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814" exitCode=0 Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.140836 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814"} Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.140868 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.143858 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.143946 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.143990 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.148839 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0"} Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.148910 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5"} Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.152866 4935 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5" exitCode=0 Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.152998 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5"} Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.153037 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.155206 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.155247 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.155262 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.155800 4935 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="613b0a2b83a81ed158edbec6c5d9c3ae8e62762689addec6701a2efba44157ef" exitCode=0 Dec 17 09:04:43 crc 
kubenswrapper[4935]: I1217 09:04:43.155839 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"613b0a2b83a81ed158edbec6c5d9c3ae8e62762689addec6701a2efba44157ef"} Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.156651 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.160662 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.160718 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.160737 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.162568 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.163897 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.163954 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:43 crc kubenswrapper[4935]: I1217 09:04:43.163972 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:43 crc kubenswrapper[4935]: E1217 09:04:43.379086 4935 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1881f5569092f882 default 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-17 09:04:41.047488642 +0000 UTC m=+0.707329415,LastTimestamp:2025-12-17 09:04:41.047488642 +0000 UTC m=+0.707329415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 17 09:04:43 crc kubenswrapper[4935]: W1217 09:04:43.890206 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:43 crc kubenswrapper[4935]: E1217 09:04:43.890312 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.051966 4935 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:44 crc kubenswrapper[4935]: E1217 09:04:44.055601 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.174232 4935 generic.go:334] "Generic (PLEG): container finished" 
podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="28a3351e82d263f370d04ce83967539e2a4f810a8a0d54edf796d103da481549" exitCode=0 Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.174341 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"28a3351e82d263f370d04ce83967539e2a4f810a8a0d54edf796d103da481549"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.174505 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.175265 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.175323 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.175335 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.177186 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1208e92f78ba64eaec2622c3170a7aef4746e6fb70ce88660f0384ecd0452f63"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.177310 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.178684 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.178704 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 
09:04:44.178714 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.179759 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.179811 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.182333 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.182360 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.182388 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.183246 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.183295 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:44 crc kubenswrapper[4935]: 
I1217 09:04:44.183317 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.184398 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.184424 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696"} Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.336064 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.337586 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.337627 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.337639 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.337667 4935 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 17 09:04:44 crc kubenswrapper[4935]: E1217 09:04:44.338171 4935 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Dec 17 09:04:44 crc kubenswrapper[4935]: I1217 09:04:44.975094 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.052239 4935 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:45 crc kubenswrapper[4935]: W1217 09:04:45.090714 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:45 crc kubenswrapper[4935]: E1217 09:04:45.090867 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.191020 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5"} Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.191110 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.194405 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.194455 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:45 crc kubenswrapper[4935]: 
I1217 09:04:45.194473 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.199479 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4"} Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.199545 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5"} Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.199561 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902"} Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.199582 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.200944 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.200984 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.200998 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.202033 4935 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="42947940bb8257ece6ca414f9bef1319fe5f55eced777de6246c7dfb19c40244" 
exitCode=0 Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.202112 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"42947940bb8257ece6ca414f9bef1319fe5f55eced777de6246c7dfb19c40244"} Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.202146 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.202192 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.202223 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.203615 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.203651 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.203665 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.204951 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.204980 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.204993 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.206431 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.206455 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:45 crc kubenswrapper[4935]: I1217 09:04:45.206465 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:45 crc kubenswrapper[4935]: W1217 09:04:45.404247 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:45 crc kubenswrapper[4935]: E1217 09:04:45.404371 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:45 crc kubenswrapper[4935]: W1217 09:04:45.807607 4935 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:45 crc kubenswrapper[4935]: E1217 09:04:45.807754 4935 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.052410 4935 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.070362 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.077375 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.209562 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8a76ff07cf05b818a0af2792d65571853430242fcf1a4770cd3400f4f603da11"} Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.209608 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.209645 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.209686 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.209711 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3348f3b3870ff8198db280c9e6562fdbaa39f72175eea648cfa9ea14a6bda53e"} Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.209750 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.209762 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.210647 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.210675 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.210686 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.212136 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.212176 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.212189 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.213081 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.213118 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.213132 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.524398 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:46 crc kubenswrapper[4935]: I1217 09:04:46.864928 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:47 crc 
kubenswrapper[4935]: I1217 09:04:47.216100 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.219498 4935 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4" exitCode=255 Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.219602 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4"} Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.219705 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.221100 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.221193 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.221219 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.222589 4935 scope.go:117] "RemoveContainer" containerID="56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.228666 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.228760 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:47 crc kubenswrapper[4935]: 
I1217 09:04:47.228674 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.228791 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.229142 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e2e09723f4cd342ee73f50119edee86cd6620f3193a450fc7465bab486757d29"} Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.229359 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e31499d75b952c73a408ee75a59979bbc96d6a0826c2c00d5fe7a37e58e608ee"} Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.229405 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"12a60f36fc8bf3a4d665ed90e4e977daad5dfa46e6ddafcfacb8205f94f067a4"} Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230703 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230739 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230703 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230778 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230805 4935 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230825 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230841 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230755 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.230849 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.538193 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.538349 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.540442 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.540522 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.540576 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.540633 4935 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.720415 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.975401 4935 patch_prober.go:28] interesting 
pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 17 09:04:47 crc kubenswrapper[4935]: I1217 09:04:47.975522 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.234150 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.236660 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209"} Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.236719 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.236767 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.236853 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.237457 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.238051 4935 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.238081 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.238091 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.238266 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.238314 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.238326 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.239079 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.239122 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:48 crc kubenswrapper[4935]: I1217 09:04:48.239135 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.239121 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.239223 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.239696 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:49 crc 
kubenswrapper[4935]: I1217 09:04:49.240177 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.240253 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.240300 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.240711 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.240735 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.240745 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.642009 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.642317 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.643920 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.644092 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:49 crc kubenswrapper[4935]: I1217 09:04:49.644216 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.242248 4935 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.244089 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.244149 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.244162 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.737403 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.737699 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.739874 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.739941 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:50 crc kubenswrapper[4935]: I1217 09:04:50.739961 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:51 crc kubenswrapper[4935]: E1217 09:04:51.237227 4935 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 17 09:04:56 crc kubenswrapper[4935]: I1217 09:04:56.791337 4935 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 17 09:04:56 crc kubenswrapper[4935]: I1217 09:04:56.791431 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 17 09:04:56 crc kubenswrapper[4935]: I1217 09:04:56.796829 4935 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 17 09:04:56 crc kubenswrapper[4935]: I1217 09:04:56.797137 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 17 09:04:56 crc kubenswrapper[4935]: I1217 09:04:56.872269 4935 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]log ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]etcd ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/openshift.io-api-request-count-filter ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/openshift.io-startkubeinformers ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Dec 17 09:04:56 crc 
kubenswrapper[4935]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/generic-apiserver-start-informers ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/priority-and-fairness-config-consumer ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/priority-and-fairness-filter ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-apiextensions-informers ok Dec 17 09:04:56 crc kubenswrapper[4935]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Dec 17 09:04:56 crc kubenswrapper[4935]: [-]poststarthook/crd-informer-synced failed: reason withheld Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-system-namespaces-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-cluster-authentication-info-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-legacy-token-tracking-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-service-ip-repair-controllers ok Dec 17 09:04:56 crc kubenswrapper[4935]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Dec 17 09:04:56 crc kubenswrapper[4935]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/priority-and-fairness-config-producer ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/bootstrap-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Dec 
17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/start-kube-aggregator-informers ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/apiservice-status-local-available-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/apiservice-status-remote-available-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/apiservice-wait-for-first-sync ok Dec 17 09:04:56 crc kubenswrapper[4935]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/kube-apiserver-autoregistration ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]autoregister-completion ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/apiservice-openapi-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: [+]poststarthook/apiservice-openapiv3-controller ok Dec 17 09:04:56 crc kubenswrapper[4935]: livez check failed Dec 17 09:04:56 crc kubenswrapper[4935]: I1217 09:04:56.872354 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.780720 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.780951 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.782685 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.782747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.782764 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.804213 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.975987 4935 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 17 09:04:57 crc kubenswrapper[4935]: I1217 09:04:57.976058 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 17 09:04:58 crc kubenswrapper[4935]: I1217 09:04:58.264507 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:58 crc kubenswrapper[4935]: I1217 09:04:58.265395 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:58 crc kubenswrapper[4935]: I1217 09:04:58.265456 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:58 crc kubenswrapper[4935]: I1217 09:04:58.265478 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:04:59 crc kubenswrapper[4935]: I1217 09:04:59.647155 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:04:59 crc kubenswrapper[4935]: I1217 09:04:59.647395 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:04:59 crc kubenswrapper[4935]: I1217 09:04:59.648596 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:04:59 crc kubenswrapper[4935]: I1217 09:04:59.648629 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:04:59 crc kubenswrapper[4935]: I1217 09:04:59.648640 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:01 crc kubenswrapper[4935]: E1217 09:05:01.238062 4935 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 17 09:05:01 crc kubenswrapper[4935]: E1217 09:05:01.790704 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.794551 4935 trace.go:236] Trace[1115747819]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Dec-2025 09:04:49.216) (total time: 12577ms): Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[1115747819]: ---"Objects listed" error: 12577ms (09:05:01.794) Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[1115747819]: [12.577563695s] [12.577563695s] END Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.794590 4935 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.795097 4935 trace.go:236] 
Trace[402703855]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Dec-2025 09:04:51.021) (total time: 10773ms): Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[402703855]: ---"Objects listed" error: 10772ms (09:05:01.794) Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[402703855]: [10.773022819s] [10.773022819s] END Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.795147 4935 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.795591 4935 trace.go:236] Trace[2060677547]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Dec-2025 09:04:48.141) (total time: 13654ms): Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[2060677547]: ---"Objects listed" error: 13654ms (09:05:01.795) Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[2060677547]: [13.654187956s] [13.654187956s] END Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.795630 4935 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.796523 4935 trace.go:236] Trace[2032342944]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Dec-2025 09:04:50.216) (total time: 11579ms): Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[2032342944]: ---"Objects listed" error: 11579ms (09:05:01.796) Dec 17 09:05:01 crc kubenswrapper[4935]: Trace[2032342944]: [11.579937287s] [11.579937287s] END Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.796547 4935 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 17 09:05:01 crc kubenswrapper[4935]: E1217 09:05:01.798147 4935 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 17 09:05:01 crc kubenswrapper[4935]: 
I1217 09:05:01.801018 4935 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.870140 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:05:01 crc kubenswrapper[4935]: I1217 09:05:01.874088 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.050497 4935 apiserver.go:52] "Watching apiserver" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.053159 4935 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.053849 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.054251 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.054261 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.054532 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.054849 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.054928 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.054989 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.055086 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.055224 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.055294 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.056628 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.056635 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.056680 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.057057 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.057212 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.057601 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.057795 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.058230 4935 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.060532 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.098565 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.110335 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.113103 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, 
/tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"
startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.123815 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.139296 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.150828 4935 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.151527 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.163624 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.174402 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.185987 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.198124 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203322 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203361 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203384 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203410 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203429 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203451 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203474 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203491 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203511 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203527 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203549 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203570 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203590 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203609 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203632 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203653 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203866 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203888 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203907 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203924 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203944 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203971 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203981 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203976 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.203990 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204070 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204055 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204075 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204077 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204091 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204192 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204222 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204244 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") 
" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204264 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204339 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204362 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204422 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204495 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204556 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204526 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204559 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204761 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204767 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204797 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204804 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204817 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205024 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205029 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205039 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205064 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205050 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.204300 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205143 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205171 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205191 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205234 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205253 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205263 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205300 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205321 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205340 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205359 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205378 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205396 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205442 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205464 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205485 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205494 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205504 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205529 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205550 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205576 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205634 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205647 4935 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205654 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205705 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205715 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205750 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205783 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205813 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205840 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205860 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205880 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205901 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205926 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205949 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205971 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205994 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206015 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206033 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206052 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206073 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206094 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206115 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206134 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206151 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206171 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206191 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206209 
4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206229 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206249 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206266 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206306 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206333 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206356 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206376 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206393 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206413 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206434 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206450 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206468 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206487 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206506 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206524 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206543 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: 
\"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206560 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206584 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206601 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206620 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206637 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206651 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206671 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206693 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206712 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206734 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206751 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206769 
4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206787 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206806 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206824 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206842 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206862 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: 
\"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206885 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206956 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206994 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207060 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207078 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207097 4935 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207117 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207141 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207158 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207235 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207661 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207989 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208042 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208068 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208094 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208114 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 17 
09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208135 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208156 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208180 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208207 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208228 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208252 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208289 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208322 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208350 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208379 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208406 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 17 09:05:02 crc 
kubenswrapper[4935]: I1217 09:05:02.208476 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208502 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208526 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208553 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208577 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208603 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208629 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208655 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208680 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208705 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208730 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208754 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208784 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208809 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208834 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208858 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208891 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") 
pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208937 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208965 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208991 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209017 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209041 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209066 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209091 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209114 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209140 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209166 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209191 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209215 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209239 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209264 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209321 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209351 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209375 4935 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209398 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209426 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209453 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210162 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210204 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210231 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210258 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210310 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210337 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210365 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210391 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210419 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210445 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210474 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210501 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210533 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210560 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210585 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210614 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210689 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210717 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210743 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210773 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210800 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210827 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210852 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210926 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 17 
09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211001 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211027 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211095 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211191 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211226 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 
09:05:02.211261 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211315 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211344 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211372 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211401 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211425 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211462 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211496 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211526 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211554 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211584 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211977 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211997 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212013 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212045 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212061 4935 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212077 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212096 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212110 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212125 4935 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212140 4935 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212155 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212172 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212188 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212204 4935 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212218 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212232 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212248 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212263 4935 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212297 4935 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212312 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212326 4935 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212341 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212356 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212371 4935 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212395 4935 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212410 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212424 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.213242 4935 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.219127 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205782 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.205952 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206009 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206067 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206114 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206171 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206192 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206332 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206456 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206588 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.219832 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206647 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206700 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206829 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206844 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.206963 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207102 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207146 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207199 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207345 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207568 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207588 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207716 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207796 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.207862 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208001 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208657 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208687 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.208734 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209021 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209121 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209266 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209324 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209400 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209425 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.209956 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210358 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210435 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210541 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210711 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.210818 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211143 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.211992 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212192 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212707 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.212769 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.213066 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.213320 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.213435 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.213602 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:05:02.713508685 +0000 UTC m=+22.373349448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.220912 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.220918 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.221677 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.221793 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.221979 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.213699 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.213858 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214074 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214093 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214148 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214299 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214398 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214426 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.222684 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214529 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214727 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214759 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.214930 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215141 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215046 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215201 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215458 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215567 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215598 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215782 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.215956 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.216398 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.217116 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.217674 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.218638 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.218648 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.218711 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.218918 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.219259 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.219652 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.222250 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.223077 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.223425 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.223693 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.224184 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.224833 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.225440 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.225779 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.226013 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.226215 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.226301 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.226543 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.226767 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.226966 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.227094 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.227290 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.227342 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.227404 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.228524 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.228634 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.228642 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.228799 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.229172 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.229324 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.229889 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.230265 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.231039 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.232010 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.232558 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.232905 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.232946 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.233073 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.233107 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.235246 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.242114 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.243005 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.243331 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.243713 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.251344 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.251460 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.251611 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.251679 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.251723 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:02.751708429 +0000 UTC m=+22.411549192 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.251955 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.252022 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.252072 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:02.752064649 +0000 UTC m=+22.411905412 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.252288 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.252303 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.252313 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.252350 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:02.752341766 +0000 UTC m=+22.412182529 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.252430 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.252756 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.252981 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.253198 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.253257 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.253937 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.254441 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.254924 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.258773 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.259146 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.260406 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.260763 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.261676 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.261943 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.262142 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.213827 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.267457 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.267921 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.268025 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.268144 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.268040 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" 
(OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.268259 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:02.768232892 +0000 UTC m=+22.428073655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.268448 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.268789 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.268827 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.268868 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.270194 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.272187 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.272952 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.275578 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.276652 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.276725 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.276979 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.276995 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.277314 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.281519 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.281703 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.281792 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.282240 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.283082 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.284908 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.285337 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.287190 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.288970 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.289798 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.290485 4935 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.293370 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.295621 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.295703 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.295766 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.297241 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.297379 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.304020 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.309337 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.311887 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313691 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313733 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313805 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313817 4935 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313826 4935 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313835 4935 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc 
kubenswrapper[4935]: I1217 09:05:02.313844 4935 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313853 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313861 4935 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313871 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313879 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313887 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313896 4935 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313905 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" 
(UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313913 4935 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313922 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313930 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313939 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313948 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313956 4935 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313964 4935 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc 
kubenswrapper[4935]: I1217 09:05:02.313971 4935 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.313980 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314054 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314099 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314116 4935 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314125 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314134 4935 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314143 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314152 4935 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314161 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314170 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314178 4935 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314186 4935 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314196 4935 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314206 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314213 4935 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314223 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314232 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314241 4935 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314249 4935 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314258 4935 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") 
on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314266 4935 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314293 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314304 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314313 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314323 4935 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314331 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314340 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc 
kubenswrapper[4935]: I1217 09:05:02.314348 4935 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314358 4935 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314366 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314374 4935 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314382 4935 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314390 4935 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314399 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314406 4935 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314413 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314421 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314429 4935 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314437 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314446 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314454 4935 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314462 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314471 4935 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314479 4935 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314487 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314495 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314504 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314512 4935 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314521 4935 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath 
\"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314529 4935 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314537 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314546 4935 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314554 4935 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314562 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314572 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314581 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314592 4935 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314601 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314611 4935 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314618 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314626 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314634 4935 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314642 4935 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314650 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314658 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314666 4935 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314674 4935 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314683 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314691 4935 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314700 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314708 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 
17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314717 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314725 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314734 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314742 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314750 4935 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314759 4935 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314767 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314776 
4935 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314785 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314794 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314803 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314813 4935 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314823 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314831 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314840 4935 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314849 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314858 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314868 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314877 4935 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314887 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314896 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314904 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on 
node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314912 4935 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314920 4935 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314929 4935 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314938 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314946 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314955 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314963 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314971 4935 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314979 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314986 4935 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.314994 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315002 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315010 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315019 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315027 4935 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315035 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315043 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315051 4935 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315059 4935 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315068 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315076 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315085 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 
17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315092 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315100 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315107 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315115 4935 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315123 4935 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315131 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315138 4935 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315146 4935 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315154 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315163 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315171 4935 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315178 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315186 4935 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315194 4935 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315202 4935 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" 
DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315209 4935 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315221 4935 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315229 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315237 4935 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315245 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315253 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315261 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315268 4935 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315293 4935 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315301 4935 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315310 4935 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315319 4935 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315327 4935 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315335 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315343 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315352 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.315360 4935 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.317260 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.317406 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.322410 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.331351 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.367421 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 17 09:05:02 crc kubenswrapper[4935]: W1217 09:05:02.378775 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-add396786bc8d3360d728142c34975056d6fdf8ae2c9697f3578d0f84261f57f WatchSource:0}: Error finding container add396786bc8d3360d728142c34975056d6fdf8ae2c9697f3578d0f84261f57f: Status 404 returned error can't find the container with id add396786bc8d3360d728142c34975056d6fdf8ae2c9697f3578d0f84261f57f Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.382394 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.391128 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 17 09:05:02 crc kubenswrapper[4935]: W1217 09:05:02.394711 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-aeee35a533f10292a9717fe8e4d7aaf15ec47ed7324c7b2cf617860d3cdc9b71 WatchSource:0}: Error finding container aeee35a533f10292a9717fe8e4d7aaf15ec47ed7324c7b2cf617860d3cdc9b71: Status 404 returned error can't find the container with id aeee35a533f10292a9717fe8e4d7aaf15ec47ed7324c7b2cf617860d3cdc9b71 Dec 17 09:05:02 crc kubenswrapper[4935]: W1217 09:05:02.409291 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-281e5d843dc99070d6b87c5b28f799627d7f3ef8b8ef4504273ade623efcbde1 WatchSource:0}: Error finding container 281e5d843dc99070d6b87c5b28f799627d7f3ef8b8ef4504273ade623efcbde1: Status 404 returned error can't find the container with id 281e5d843dc99070d6b87c5b28f799627d7f3ef8b8ef4504273ade623efcbde1 Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.415681 4935 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.415713 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.718511 4935 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.718660 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:05:03.718640907 +0000 UTC m=+23.378481670 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.820011 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.820128 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820166 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.820176 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820187 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820199 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: I1217 09:05:02.820204 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820245 4935 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:03.820229265 +0000 UTC m=+23.480070028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820369 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820378 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820427 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820390 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820475 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-17 09:05:03.820457402 +0000 UTC m=+23.480298165 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820478 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820520 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:03.820509673 +0000 UTC m=+23.480350506 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:02 crc kubenswrapper[4935]: E1217 09:05:02.820542 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:03.820532154 +0000 UTC m=+23.480373127 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.128641 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.129581 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.131058 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.132120 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.134311 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.135127 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.136992 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.138011 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.140402 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.141427 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.142843 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.144662 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.146045 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.146881 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.147938 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.148577 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.149158 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.149943 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.150528 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.151079 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.159789 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.160916 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.161922 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.162690 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.163152 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.164183 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.165445 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.166048 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.166742 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.167815 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.168424 4935 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath 
from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.168579 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.171266 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.171976 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.172520 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.174191 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.175705 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.176257 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.177360 4935 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.182889 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.183717 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.185044 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.186233 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.187245 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.188106 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.189175 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.189718 4935 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.191089 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.191564 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.192554 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.193342 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.194599 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.195473 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.196155 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.292521 4935 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b"} Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.292627 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38"} Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.292644 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"281e5d843dc99070d6b87c5b28f799627d7f3ef8b8ef4504273ade623efcbde1"} Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.293557 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"aeee35a533f10292a9717fe8e4d7aaf15ec47ed7324c7b2cf617860d3cdc9b71"} Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.296029 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b"} Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.296124 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"add396786bc8d3360d728142c34975056d6fdf8ae2c9697f3578d0f84261f57f"} Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 
09:05:03.315537 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.353905 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.369981 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 
09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.384315 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.406830 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.423123 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.438056 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.452605 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.472266 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.488158 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.508479 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.522946 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.533795 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.548850 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.618218 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bw8z8"] Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.619003 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.619117 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-n6z48"] Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.619519 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.621872 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.622366 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.626500 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.626570 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.627656 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.627729 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.628549 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.646309 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.666729 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.697192 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.721084 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.731117 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.731267 4935 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:05:05.73124072 +0000 UTC m=+25.391081483 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.731415 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c716f0c7-850f-4cc4-bd28-5a2807f126a3-serviceca\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.731538 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmjl\" (UniqueName: \"kubernetes.io/projected/c716f0c7-850f-4cc4-bd28-5a2807f126a3-kube-api-access-mvmjl\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.731589 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c716f0c7-850f-4cc4-bd28-5a2807f126a3-host\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.731620 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339-hosts-file\") pod \"node-resolver-bw8z8\" (UID: \"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\") " pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.731702 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lfmh\" (UniqueName: \"kubernetes.io/projected/a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339-kube-api-access-9lfmh\") pod \"node-resolver-bw8z8\" (UID: \"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\") " pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.739289 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastStat
e\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.764873 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.780184 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.794199 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.808748 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.825883 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832189 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c716f0c7-850f-4cc4-bd28-5a2807f126a3-host\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832250 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832293 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lfmh\" (UniqueName: \"kubernetes.io/projected/a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339-kube-api-access-9lfmh\") pod \"node-resolver-bw8z8\" (UID: \"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\") " pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832318 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339-hosts-file\") pod \"node-resolver-bw8z8\" (UID: \"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\") " pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832344 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832351 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c716f0c7-850f-4cc4-bd28-5a2807f126a3-host\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832368 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c716f0c7-850f-4cc4-bd28-5a2807f126a3-serviceca\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832474 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832506 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832536 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmjl\" (UniqueName: \"kubernetes.io/projected/c716f0c7-850f-4cc4-bd28-5a2807f126a3-kube-api-access-mvmjl\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.832528 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339-hosts-file\") pod \"node-resolver-bw8z8\" (UID: \"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\") " pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832558 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832621 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832639 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832696 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2025-12-17 09:05:05.832669105 +0000 UTC m=+25.492510038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832653 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832756 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832776 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:05.832744886 +0000 UTC m=+25.492585649 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832841 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:05.832797048 +0000 UTC m=+25.492637801 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832866 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832900 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:03 crc kubenswrapper[4935]: E1217 09:05:03.832914 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:03 crc 
kubenswrapper[4935]: E1217 09:05:03.832944 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:05.832936181 +0000 UTC m=+25.492776944 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.833922 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c716f0c7-850f-4cc4-bd28-5a2807f126a3-serviceca\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.841881 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.859701 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmjl\" (UniqueName: \"kubernetes.io/projected/c716f0c7-850f-4cc4-bd28-5a2807f126a3-kube-api-access-mvmjl\") pod \"node-ca-n6z48\" (UID: \"c716f0c7-850f-4cc4-bd28-5a2807f126a3\") " 
pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.861124 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lfmh\" (UniqueName: \"kubernetes.io/projected/a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339-kube-api-access-9lfmh\") pod \"node-resolver-bw8z8\" (UID: \"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\") " pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.866236 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\
\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.889544 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.904448 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.922294 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.934662 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bw8z8" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.939290 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-n6z48" Dec 17 09:05:03 crc kubenswrapper[4935]: I1217 09:05:03.965621 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.000224 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:03Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.123760 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.123832 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:04 crc kubenswrapper[4935]: E1217 09:05:04.123888 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:04 crc kubenswrapper[4935]: E1217 09:05:04.123990 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.124037 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:04 crc kubenswrapper[4935]: E1217 09:05:04.124099 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.298921 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n6z48" event={"ID":"c716f0c7-850f-4cc4-bd28-5a2807f126a3","Type":"ContainerStarted","Data":"46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c"} Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.298981 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n6z48" event={"ID":"c716f0c7-850f-4cc4-bd28-5a2807f126a3","Type":"ContainerStarted","Data":"a119e9b95820be24fe9d46d4fda2dfefe3718ab1ab81fdbaaa26ae7f30b23a28"} Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.300406 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bw8z8" event={"ID":"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339","Type":"ContainerStarted","Data":"ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2"} Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.300432 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bw8z8" event={"ID":"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339","Type":"ContainerStarted","Data":"2945a0c3730a5057f0d80a5f99f5c316c3d84d5d8b56488024667490da6225a7"} Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.315652 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.346345 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.369066 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.381001 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.394362 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.413522 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-k7lhw"] Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.413963 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-jrmtf"] Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.414114 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qzmn2"] Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.414122 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.414198 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.414793 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.415116 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.416712 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.417480 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.417735 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.417922 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.418036 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.418036 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.418208 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.418590 4935 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"multus-daemon-config" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.421174 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.421327 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.433538 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.434798 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.435332 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.454493 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.496682 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.519491 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.538726 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-cni-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " 
pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.538776 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-conf-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.538860 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-system-cni-dir\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.538972 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d8b2226-e518-487d-967a-78cbfd4da1dc-proxy-tls\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539028 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-cnibin\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539057 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-netns\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " 
pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539091 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-kubelet\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539125 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cni-binary-copy\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539167 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-system-cni-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539187 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-etc-kubernetes\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539210 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-os-release\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " 
pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539250 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-hostroot\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539291 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-multus-certs\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539311 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqxq4\" (UniqueName: \"kubernetes.io/projected/8b52811a-aff2-43c1-9074-f0654f991d9c-kube-api-access-dqxq4\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539345 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-os-release\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539381 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b52811a-aff2-43c1-9074-f0654f991d9c-cni-binary-copy\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 
crc kubenswrapper[4935]: I1217 09:05:04.539399 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cnibin\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539418 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539459 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6d8b2226-e518-487d-967a-78cbfd4da1dc-rootfs\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539482 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l52zs\" (UniqueName: \"kubernetes.io/projected/6d8b2226-e518-487d-967a-78cbfd4da1dc-kube-api-access-l52zs\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539499 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-cni-bin\") pod \"multus-jrmtf\" 
(UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539546 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-daemon-config\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539571 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-socket-dir-parent\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539606 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-cni-multus\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539640 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-k8s-cni-cncf-io\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539685 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf6lc\" (UniqueName: \"kubernetes.io/projected/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-kube-api-access-vf6lc\") pod 
\"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539713 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d8b2226-e518-487d-967a-78cbfd4da1dc-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.539732 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.540357 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.577405 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.590157 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac3
9aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.603600 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.617436 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.634704 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.640803 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6d8b2226-e518-487d-967a-78cbfd4da1dc-rootfs\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.640853 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l52zs\" (UniqueName: \"kubernetes.io/projected/6d8b2226-e518-487d-967a-78cbfd4da1dc-kube-api-access-l52zs\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.640880 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-cni-bin\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.640914 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-daemon-config\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.640914 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6d8b2226-e518-487d-967a-78cbfd4da1dc-rootfs\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.640987 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-cni-multus\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.640933 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-cni-multus\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641027 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-cni-bin\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641163 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-socket-dir-parent\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641262 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-k8s-cni-cncf-io\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641307 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf6lc\" (UniqueName: \"kubernetes.io/projected/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-kube-api-access-vf6lc\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641339 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d8b2226-e518-487d-967a-78cbfd4da1dc-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641356 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-k8s-cni-cncf-io\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641359 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641415 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-conf-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641426 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-socket-dir-parent\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641436 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-system-cni-dir\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641458 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-system-cni-dir\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641484 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-cni-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641512 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d8b2226-e518-487d-967a-78cbfd4da1dc-proxy-tls\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641532 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-cnibin\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641556 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-netns\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641572 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-kubelet\") pod \"multus-jrmtf\" 
(UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641486 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-conf-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641594 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cni-binary-copy\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641615 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-cni-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641648 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-cnibin\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641671 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-system-cni-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641706 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-var-lib-kubelet\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641755 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-etc-kubernetes\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641728 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-etc-kubernetes\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641841 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-os-release\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641873 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-hostroot\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641896 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-multus-certs\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641915 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqxq4\" (UniqueName: \"kubernetes.io/projected/8b52811a-aff2-43c1-9074-f0654f991d9c-kube-api-access-dqxq4\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641967 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-hostroot\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641979 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-multus-certs\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.641999 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-system-cni-dir\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642018 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cnibin\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " 
pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642128 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6d8b2226-e518-487d-967a-78cbfd4da1dc-mcd-auth-proxy-config\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642036 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cnibin\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642167 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-os-release\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642199 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b52811a-aff2-43c1-9074-f0654f991d9c-cni-binary-copy\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642228 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 
17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642299 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cni-binary-copy\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642373 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642379 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-os-release\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642383 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-os-release\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642448 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8b52811a-aff2-43c1-9074-f0654f991d9c-multus-daemon-config\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642473 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8b52811a-aff2-43c1-9074-f0654f991d9c-host-run-netns\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642824 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.642948 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8b52811a-aff2-43c1-9074-f0654f991d9c-cni-binary-copy\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.645831 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6d8b2226-e518-487d-967a-78cbfd4da1dc-proxy-tls\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.650801 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.659907 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l52zs\" (UniqueName: \"kubernetes.io/projected/6d8b2226-e518-487d-967a-78cbfd4da1dc-kube-api-access-l52zs\") pod \"machine-config-daemon-k7lhw\" (UID: \"6d8b2226-e518-487d-967a-78cbfd4da1dc\") " pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.660682 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqxq4\" (UniqueName: \"kubernetes.io/projected/8b52811a-aff2-43c1-9074-f0654f991d9c-kube-api-access-dqxq4\") pod \"multus-jrmtf\" (UID: \"8b52811a-aff2-43c1-9074-f0654f991d9c\") " pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.663836 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf6lc\" (UniqueName: \"kubernetes.io/projected/44bcbaec-1004-4feb-88ca-4fb1aeeb7c73-kube-api-access-vf6lc\") pod 
\"multus-additional-cni-plugins-qzmn2\" (UID: \"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\") " pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.667046 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.680446 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.695536 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.711319 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.729496 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.735761 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jrmtf" Dec 17 09:05:04 crc kubenswrapper[4935]: W1217 09:05:04.743091 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d8b2226_e518_487d_967a_78cbfd4da1dc.slice/crio-20863ef333330ed4487fdcf614653fdccaddb3ed966c7d51e85d4f31bd66516d WatchSource:0}: Error finding container 20863ef333330ed4487fdcf614653fdccaddb3ed966c7d51e85d4f31bd66516d: Status 404 returned error can't find the container with id 20863ef333330ed4487fdcf614653fdccaddb3ed966c7d51e85d4f31bd66516d Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.743315 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" Dec 17 09:05:04 crc kubenswrapper[4935]: W1217 09:05:04.763300 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b52811a_aff2_43c1_9074_f0654f991d9c.slice/crio-39c490ce0ceecd47971c46cd9fd90ff5e31c752bbb18b2493f1a26e7f46855bb WatchSource:0}: Error finding container 39c490ce0ceecd47971c46cd9fd90ff5e31c752bbb18b2493f1a26e7f46855bb: Status 404 returned error can't find the container with id 39c490ce0ceecd47971c46cd9fd90ff5e31c752bbb18b2493f1a26e7f46855bb Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.812678 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rwwd4"] Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.814162 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.817566 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.818664 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.819141 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.818798 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.819092 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.819339 4935 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.819427 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.839205 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.855128 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.873330 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.891713 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.909105 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.929399 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.944869 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945419 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-config\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945466 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-var-lib-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945513 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-ovn\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945533 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-bin\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945550 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-netd\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945646 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-script-lib\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945691 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-ovn-kubernetes\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945708 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945762 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-node-log\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945788 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945851 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-env-overrides\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.945957 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovn-node-metrics-cert\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" 
Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946001 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-systemd\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946031 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-etc-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946104 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-systemd-units\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946135 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-netns\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946153 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-slash\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946206 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ftrx\" (UniqueName: \"kubernetes.io/projected/969f53bb-09fc-4577-8f7c-dc6ca1679add-kube-api-access-8ftrx\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946250 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-kubelet\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.946346 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-log-socket\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.959687 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.974239 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.982510 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.985501 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.989284 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:05:04 crc kubenswrapper[4935]: I1217 09:05:04.992621 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.000259 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:04Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.015869 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.030984 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047292 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-bin\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047546 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-bin\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047663 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-netd\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047781 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-script-lib\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047825 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-ovn-kubernetes\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047855 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047884 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-node-log\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047952 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047980 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-env-overrides\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047957 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048013 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-node-log\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048009 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovn-node-metrics-cert\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.047993 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048045 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-ovn-kubernetes\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048117 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-systemd\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048163 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-etc-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048187 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-systemd\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048195 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-systemd-units\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048255 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-etc-openvswitch\") pod 
\"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048293 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-slash\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048317 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-slash\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048225 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-systemd-units\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048340 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-netns\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048368 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-netns\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048384 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ftrx\" (UniqueName: \"kubernetes.io/projected/969f53bb-09fc-4577-8f7c-dc6ca1679add-kube-api-access-8ftrx\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048436 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-kubelet\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048481 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-log-socket\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048529 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-config\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048554 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-var-lib-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 
crc kubenswrapper[4935]: I1217 09:05:05.048578 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-ovn\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048620 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-env-overrides\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048630 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-ovn\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048661 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-log-socket\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.048697 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-script-lib\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.049824 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-var-lib-openvswitch\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.049899 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.050106 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-netd\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.050457 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-config\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.050518 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-kubelet\") pod \"ovnkube-node-rwwd4\" (UID: 
\"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.053911 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovn-node-metrics-cert\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.065909 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ftrx\" (UniqueName: \"kubernetes.io/projected/969f53bb-09fc-4577-8f7c-dc6ca1679add-kube-api-access-8ftrx\") pod \"ovnkube-node-rwwd4\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.068447 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.083651 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.102224 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.115384 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d
188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.126661 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.131470 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: W1217 09:05:05.141058 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod969f53bb_09fc_4577_8f7c_dc6ca1679add.slice/crio-d587b0881dce447b95d2989ed7f67a3b0cb238463a31ed8b3830fc424d506094 WatchSource:0}: Error finding container d587b0881dce447b95d2989ed7f67a3b0cb238463a31ed8b3830fc424d506094: Status 404 returned error can't find the container with id d587b0881dce447b95d2989ed7f67a3b0cb238463a31ed8b3830fc424d506094 Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.147780 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.164561 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.176638 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.190203 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.202656 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.213780 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.227383 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: 
connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Complete
d\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.241872 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.312512 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.312570 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"d587b0881dce447b95d2989ed7f67a3b0cb238463a31ed8b3830fc424d506094"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.316194 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerStarted","Data":"f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.316257 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerStarted","Data":"39c490ce0ceecd47971c46cd9fd90ff5e31c752bbb18b2493f1a26e7f46855bb"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.320651 4935 generic.go:334] "Generic (PLEG): container finished" podID="44bcbaec-1004-4feb-88ca-4fb1aeeb7c73" containerID="ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b" exitCode=0 Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.320737 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" 
event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerDied","Data":"ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.320811 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerStarted","Data":"efa2cc997357e3fba68e3b76ef8d0fee4deb05404af7a9a25a879dbaf2617d2e"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.322930 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.322967 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.322982 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"20863ef333330ed4487fdcf614653fdccaddb3ed966c7d51e85d4f31bd66516d"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.326476 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e"} Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.329756 4935 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.344811 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.362138 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.375731 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.391832 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.407207 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.422190 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.444769 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.485785 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 
09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.523041 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready 
status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.564391 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.608162 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.643325 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.684951 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.723129 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d
188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.756336 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.756511 4935 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:05:09.756486866 +0000 UTC m=+29.416327629 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.764395 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.805055 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":
0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.846631 4935 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.857232 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.857299 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.857332 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.857361 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:05 crc kubenswrapper[4935]: 
E1217 09:05:05.857424 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857501 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857525 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857501 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857542 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857551 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857527 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:09.85750344 +0000 UTC m=+29.517344283 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857583 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857683 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:09.857621783 +0000 UTC m=+29.517462556 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857723 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:09.857712235 +0000 UTC m=+29.517553128 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857716 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: E1217 09:05:05.857867 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:09.857800747 +0000 UTC m=+29.517641580 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.885983 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.930764 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:05 crc kubenswrapper[4935]: I1217 09:05:05.969736 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:05Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.004044 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.045815 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: 
connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Complete
d\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.086423 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.123483 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.123525 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.123493 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:06 crc kubenswrapper[4935]: E1217 09:05:06.123604 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:06 crc kubenswrapper[4935]: E1217 09:05:06.123710 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:06 crc kubenswrapper[4935]: E1217 09:05:06.123789 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.126738 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.170737 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.203588 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.245058 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.332205 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerStarted","Data":"8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32"} Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.335620 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f" exitCode=0 Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.336113 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} Dec 17 
09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.336152 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.336170 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.352496 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.374986 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.387560 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.400321 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.444165 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.483789 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.525721 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.569157 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.605695 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.645681 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.683893 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.731021 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.762542 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:06 crc kubenswrapper[4935]: I1217 09:05:06.805394 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:06Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.341125 4935 generic.go:334] "Generic (PLEG): container finished" podID="44bcbaec-1004-4feb-88ca-4fb1aeeb7c73" containerID="8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32" exitCode=0 Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.341222 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerDied","Data":"8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32"} Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.346080 4935 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.346144 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.346165 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.346186 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.365456 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.390976 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.406996 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.424832 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.440592 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61d
ed5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.456856 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.470440 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.484624 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.500211 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.515756 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.530353 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.545146 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.560681 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:07 crc kubenswrapper[4935]: I1217 09:05:07.574910 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:07Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.123471 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.123611 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.123814 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.123908 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.124121 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.124427 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.198404 4935 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.200793 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.200817 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.200826 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.200928 4935 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.212339 4935 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.212624 4935 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.213666 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.213702 4935 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.213713 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.213732 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.213743 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.227205 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.231840 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.231870 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.231878 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.231893 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.231903 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.246057 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.253033 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.253096 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.253111 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.253134 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.253149 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.266419 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.271699 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.271758 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.271772 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.271793 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.271808 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.285979 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.290869 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.290923 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.290936 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.290953 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.290963 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.305363 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: E1217 09:05:08.305573 4935 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.307685 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.307728 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.307742 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.307760 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.307772 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.352446 4935 generic.go:334] "Generic (PLEG): container finished" podID="44bcbaec-1004-4feb-88ca-4fb1aeeb7c73" containerID="f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78" exitCode=0 Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.352518 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerDied","Data":"f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78"} Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.372765 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.390200 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.410792 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.410832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.410840 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.410853 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.410863 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.413316 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.429838 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.443995 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.454977 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.466079 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.478941 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.494181 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.506294 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.513661 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.513709 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.513718 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.513734 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.513744 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.517904 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.544380 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:
46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.574613 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.588360 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:08Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.616536 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.616579 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.616588 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 
09:05:08.616603 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.616614 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.719522 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.719591 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.719616 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.719644 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.719663 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.822868 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.822949 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.822968 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.822996 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.823029 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.925200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.925243 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.925254 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.925290 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:08 crc kubenswrapper[4935]: I1217 09:05:08.925303 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:08Z","lastTransitionTime":"2025-12-17T09:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.027687 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.027992 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.028076 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.028181 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.028260 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.130873 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.130932 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.130944 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.130962 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.130972 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.233076 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.233115 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.233126 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.233143 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.233154 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.337671 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.337745 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.337764 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.337789 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.337807 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.361560 4935 generic.go:334] "Generic (PLEG): container finished" podID="44bcbaec-1004-4feb-88ca-4fb1aeeb7c73" containerID="77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec" exitCode=0 Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.361663 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerDied","Data":"77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.385210 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.421378 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.444190 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.444231 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.444241 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.444257 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.444287 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.444323 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"r
ecursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.465070 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.487867 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.502580 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.515197 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.527591 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.543854 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.546440 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc 
kubenswrapper[4935]: I1217 09:05:09.546484 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.546500 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.546523 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.546538 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.558084 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.571216 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be
9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.587718 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.610970 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.625181 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:09Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.648651 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.648711 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.648725 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 
09:05:09.648752 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.648767 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.751699 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.751784 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.751797 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.751817 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.751830 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.802644 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.802880 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:05:17.802833385 +0000 UTC m=+37.462674198 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.854975 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.855038 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.855057 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.855082 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: 
I1217 09:05:09.855098 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.903491 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.903539 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.903560 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.903586 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.903696 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.903752 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:17.903734396 +0000 UTC m=+37.563575149 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.903892 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.903945 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.903965 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.904069 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:17.904035383 +0000 UTC m=+37.563876316 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.904065 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.904189 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:17.904163576 +0000 UTC m=+37.564004529 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.904086 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.904242 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.904258 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:09 crc kubenswrapper[4935]: E1217 09:05:09.904327 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:17.90431608 +0000 UTC m=+37.564157053 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.958143 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.958194 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.958207 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.958224 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:09 crc kubenswrapper[4935]: I1217 09:05:09.958235 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:09Z","lastTransitionTime":"2025-12-17T09:05:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.061494 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.061538 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.061548 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.061563 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.061575 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.123781 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.123816 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.123795 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:10 crc kubenswrapper[4935]: E1217 09:05:10.123950 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:10 crc kubenswrapper[4935]: E1217 09:05:10.124052 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:10 crc kubenswrapper[4935]: E1217 09:05:10.124137 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.163587 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.163624 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.163635 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.163649 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.163659 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.265933 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.265973 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.265984 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.266000 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.266011 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.367477 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.367519 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.367530 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.367545 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.367557 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.368452 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.370366 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerStarted","Data":"dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.384790 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111
ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.395503 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.407057 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.424383 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.438198 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.454893 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.470823 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc 
kubenswrapper[4935]: I1217 09:05:10.470868 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.470878 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.470894 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.470905 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.472531 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.488309 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.501560 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.514159 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.527231 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.540070 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.555107 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.573211 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.573265 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.573295 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.573317 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.573331 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.574596 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:10Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.676444 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.676499 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.676510 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.676528 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.676541 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.781185 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.781264 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.781308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.781340 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.781358 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.883690 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.883736 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.883744 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.883759 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.883770 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.986763 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.986834 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.986852 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.986879 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:10 crc kubenswrapper[4935]: I1217 09:05:10.986898 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:10Z","lastTransitionTime":"2025-12-17T09:05:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.089923 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.089992 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.090006 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.090028 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.090042 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.137165 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.150023 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.165070 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.179916 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.192506 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.192537 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.192546 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.192562 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.192573 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.208761 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.254871 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.268219 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.281634 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.295245 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 
09:05:11.295359 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.295380 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.295405 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.295423 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.299803 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.312847 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.328262 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.357209 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.370058 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.375939 4935 generic.go:334] "Generic (PLEG): container finished" podID="44bcbaec-1004-4feb-88ca-4fb1aeeb7c73" containerID="dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04" exitCode=0 Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.375994 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerDied","Data":"dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.391612 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111
ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
2-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.401748 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.401804 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.401820 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.401838 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.401850 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.405336 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.423628 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.435524 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.451319 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.466441 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.480059 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.500237 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.503822 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 
09:05:11.503852 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.503861 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.503876 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.503888 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.511775 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.522516 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.536817 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.549103 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.562824 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.577428 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\
\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-c
rc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.587003 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:11Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.606949 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.606990 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.607001 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.607017 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.607028 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.709406 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.709450 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.709465 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.709484 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.709496 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.812686 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.813050 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.813059 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.813076 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.813089 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.916153 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.916193 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.916203 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.916219 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:11 crc kubenswrapper[4935]: I1217 09:05:11.916230 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:11Z","lastTransitionTime":"2025-12-17T09:05:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.018796 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.018878 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.018890 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.018904 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.018915 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.121731 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.121790 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.121812 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.121829 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.121839 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.123725 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:12 crc kubenswrapper[4935]: E1217 09:05:12.123824 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.123883 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.124042 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:12 crc kubenswrapper[4935]: E1217 09:05:12.124266 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:12 crc kubenswrapper[4935]: E1217 09:05:12.124523 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.224075 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.224116 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.224127 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.224148 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.224159 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.326843 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.327146 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.327218 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.327303 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.327367 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.384712 4935 generic.go:334] "Generic (PLEG): container finished" podID="44bcbaec-1004-4feb-88ca-4fb1aeeb7c73" containerID="865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff" exitCode=0 Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.384788 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerDied","Data":"865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.391605 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.395649 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.395720 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.411777 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.421019 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 
09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.424573 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.424659 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.431057 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.431090 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.431098 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.431112 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 
09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.431122 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.439891 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.458464 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.473714 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.489933 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.502524 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.513437 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.528144 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.534299 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.534573 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.534583 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.534599 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.534610 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.541138 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.554187 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.572254 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.585332 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.601188 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.615723 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.627226 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.636793 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.636856 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.636869 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.636888 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.636902 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.638612 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.655878 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.670402 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.686063 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.699179 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.711387 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.722124 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.733266 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.739169 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.739200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.739208 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.739223 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.739233 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.747416 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:
05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.760522 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.774110 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.784800 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:12Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.842138 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.842174 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.842183 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.842197 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.842206 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.944144 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.944192 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.944199 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.944217 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:12 crc kubenswrapper[4935]: I1217 09:05:12.944226 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:12Z","lastTransitionTime":"2025-12-17T09:05:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.046522 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.046557 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.046569 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.046585 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.046599 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.149577 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.149632 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.149644 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.149662 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.149674 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.255261 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.255399 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.255429 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.255473 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.255500 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.359175 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.359235 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.359247 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.359287 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.359301 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.403420 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" event={"ID":"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73","Type":"ContainerStarted","Data":"ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.403641 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.419371 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.441684 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.456884 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4
2e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.466080 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.466150 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.466171 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.466199 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: 
I1217 09:05:13.466218 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.473890 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.490859 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.506232 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.518785 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.533686 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.547394 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.559761 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.569085 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.569121 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.569130 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.569146 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.569157 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.573396 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.588379 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.604530 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.616507 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:13Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.672443 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.672497 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.672513 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.672538 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.672556 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.775708 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.775765 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.775776 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.775794 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.775807 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.878659 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.878726 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.878741 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.878758 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.878768 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.981195 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.981245 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.981255 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.981292 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:13 crc kubenswrapper[4935]: I1217 09:05:13.981305 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:13Z","lastTransitionTime":"2025-12-17T09:05:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.084096 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.084150 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.084164 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.084182 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.084192 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.123575 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.123633 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:14 crc kubenswrapper[4935]: E1217 09:05:14.123746 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:14 crc kubenswrapper[4935]: E1217 09:05:14.123930 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.124041 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:14 crc kubenswrapper[4935]: E1217 09:05:14.124230 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.187132 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.187423 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.187494 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.187567 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.187625 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.290135 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.290177 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.290187 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.290202 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.290214 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.393349 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.393413 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.393429 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.393453 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.393467 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.405742 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.495980 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.496021 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.496032 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.496048 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.496058 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.599470 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.599574 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.599607 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.599647 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.599674 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.702514 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.702562 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.702573 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.702591 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.702605 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.805187 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.805225 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.805234 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.805249 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.805260 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.909191 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.909247 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.909256 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.909295 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:14 crc kubenswrapper[4935]: I1217 09:05:14.909306 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:14Z","lastTransitionTime":"2025-12-17T09:05:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.012984 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.013066 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.013117 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.013163 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.013191 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.115952 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.116001 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.116009 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.116025 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.116039 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.219856 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.219925 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.219941 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.219970 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.219999 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.323511 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.323558 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.323569 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.323596 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.323610 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.414529 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/0.log" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.418584 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e" exitCode=1 Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.418668 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.420087 4935 scope.go:117] "RemoveContainer" containerID="b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.427615 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.427674 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.427695 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.427725 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.427748 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.445628 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c
7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.463685 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.479512 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.506209 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09
:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.520419 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.530075 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.530124 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.530137 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.530159 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.530175 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.541874 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.555231 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.568874 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.582327 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.595002 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.605360 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.617360 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.629481 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.633464 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.633504 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.633521 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.633539 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.633551 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.645634 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:15Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.735936 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.735980 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.735993 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.736008 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.736020 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.841551 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.841621 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.841641 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.841668 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.842394 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.946036 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.946134 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.946153 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.946179 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:15 crc kubenswrapper[4935]: I1217 09:05:15.946196 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:15Z","lastTransitionTime":"2025-12-17T09:05:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.048367 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.048401 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.048413 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.048430 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.048442 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.123572 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.123637 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:16 crc kubenswrapper[4935]: E1217 09:05:16.123699 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:16 crc kubenswrapper[4935]: E1217 09:05:16.123766 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.123647 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:16 crc kubenswrapper[4935]: E1217 09:05:16.123980 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.150980 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.151021 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.151029 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.151045 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.151056 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.253986 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.254039 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.254053 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.254073 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.254085 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.357156 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.357222 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.357235 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.357251 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.357262 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.423874 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/0.log" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.428578 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.428803 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.449205 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.460440 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.460490 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.460502 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc 
kubenswrapper[4935]: I1217 09:05:16.460519 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.460531 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.466510 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 
09:05:16.481792 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.493673 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.506369 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.521855 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.540411 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.557064 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.562992 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.563032 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.563046 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.563106 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.563121 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.572555 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.586380 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.604209 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.629367 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 
handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitc
h\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7
73257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.641516 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm"] Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.642084 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: W1217 09:05:16.644925 4935 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert": failed to list *v1.Secret: secrets "ovn-control-plane-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 17 09:05:16 crc kubenswrapper[4935]: E1217 09:05:16.644978 4935 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-control-plane-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-control-plane-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 17 09:05:16 crc kubenswrapper[4935]: W1217 09:05:16.645038 4935 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd": failed to list *v1.Secret: secrets "ovn-kubernetes-control-plane-dockercfg-gs7dd" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Dec 17 09:05:16 crc kubenswrapper[4935]: E1217 09:05:16.645054 4935 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-control-plane-dockercfg-gs7dd\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-kubernetes-control-plane-dockercfg-gs7dd\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 17 09:05:16 crc 
kubenswrapper[4935]: I1217 09:05:16.653312 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.665852 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.665898 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.665908 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.665923 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.665934 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.679251 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.705737 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.727590 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25
b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.744554 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.762016 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.768896 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.768955 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.768968 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.768992 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.769005 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.776932 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.777542 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftd9l\" (UniqueName: \"kubernetes.io/projected/8a1fb087-2513-44cc-8dfd-e9879b0e840c-kube-api-access-ftd9l\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.777659 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.777730 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a1fb087-2513-44cc-8dfd-e9879b0e840c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.777750 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.792712 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.813175 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.825083 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.835814 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.847736 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.862030 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.871545 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.871618 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.871631 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.871650 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.871663 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.878092 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.878628 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.878728 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a1fb087-2513-44cc-8dfd-e9879b0e840c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.878759 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.879495 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a1fb087-2513-44cc-8dfd-e9879b0e840c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.879564 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 
09:05:16.879651 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftd9l\" (UniqueName: \"kubernetes.io/projected/8a1fb087-2513-44cc-8dfd-e9879b0e840c-kube-api-access-ftd9l\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.892668 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":
{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.896954 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftd9l\" (UniqueName: \"kubernetes.io/projected/8a1fb087-2513-44cc-8dfd-e9879b0e840c-kube-api-access-ftd9l\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.912852 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 
09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"moun
tPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.932030 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:16Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.973674 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.973718 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.973732 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.973758 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:16 crc kubenswrapper[4935]: I1217 09:05:16.973771 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:16Z","lastTransitionTime":"2025-12-17T09:05:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.076380 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.076662 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.076739 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.076807 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.076875 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.179100 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.179132 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.179141 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.179156 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.179166 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.282486 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.282533 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.282547 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.282565 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.282579 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.385143 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.385193 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.385207 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.385226 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.385238 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.487100 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.487140 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.487149 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.487164 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.487173 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.589461 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.589551 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.589566 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.589585 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.589599 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.692059 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.692124 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.692143 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.692173 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.692205 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.743320 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rg2z5"] Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.743793 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.743851 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.760344 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.771793 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.788841 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.794211 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.794250 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.794265 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.794301 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.794315 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.814816 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 
09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"moun
tPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.831060 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.844892 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c01
1892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e
0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\
\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.857578 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.867795 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc 
kubenswrapper[4935]: I1217 09:05:17.880392 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.880447 4935 secret.go:188] Couldn't get secret openshift-ovn-kubernetes/ovn-control-plane-metrics-cert: failed to sync secret cache: timed out waiting for the condition Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.880550 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovn-control-plane-metrics-cert podName:8a1fb087-2513-44cc-8dfd-e9879b0e840c nodeName:}" failed. No retries permitted until 2025-12-17 09:05:18.380524775 +0000 UTC m=+38.040365538 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "ovn-control-plane-metrics-cert" (UniqueName: "kubernetes.io/secret/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovn-control-plane-metrics-cert") pod "ovnkube-control-plane-749d76644c-sh5rm" (UID: "8a1fb087-2513-44cc-8dfd-e9879b0e840c") : failed to sync secret cache: timed out waiting for the condition Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.890333 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.890476 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhr4p\" (UniqueName: \"kubernetes.io/projected/77feddc8-547a-42a0-baa3-19dd2915eb9f-kube-api-access-dhr4p\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.890503 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:05:33.890481113 +0000 UTC m=+53.550321876 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.890566 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.895340 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.897734 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.897804 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.897827 4935 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.897859 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.897884 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:17Z","lastTransitionTime":"2025-12-17T09:05:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.906746 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d41
6051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.918107 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.942876 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"ph
ase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.956398 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.977534 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.991144 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:17Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.991239 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.991268 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhr4p\" (UniqueName: \"kubernetes.io/projected/77feddc8-547a-42a0-baa3-19dd2915eb9f-kube-api-access-dhr4p\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.991319 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.991342 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.991366 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:17 crc kubenswrapper[4935]: I1217 09:05:17.991399 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991343 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991467 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered 
Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991480 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991484 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991500 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991531 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:05:18.491513448 +0000 UTC m=+38.151354211 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991547 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:33.991539229 +0000 UTC m=+53.651379992 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991548 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991424 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991625 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:33.991566679 +0000 UTC m=+53.651407482 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991681 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:33.991661782 +0000 UTC m=+53.651502585 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991680 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991721 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:17 crc kubenswrapper[4935]: E1217 09:05:17.991847 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:05:33.991815585 +0000 UTC m=+53.651656348 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.000600 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.000638 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.000647 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.000664 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.000676 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.009328 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhr4p\" (UniqueName: \"kubernetes.io/projected/77feddc8-547a-42a0-baa3-19dd2915eb9f-kube-api-access-dhr4p\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.009695 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.103228 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.103350 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.103364 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.103379 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.103390 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.123865 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.123989 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.124002 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.124230 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.124441 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.124667 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.133026 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.205721 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.205804 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.205829 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.205864 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.205887 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.308353 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.308418 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.308433 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.308462 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.308481 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.394526 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.397712 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a1fb087-2513-44cc-8dfd-e9879b0e840c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sh5rm\" (UID: \"8a1fb087-2513-44cc-8dfd-e9879b0e840c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.410976 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.411012 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.411022 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.411039 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.411049 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.454982 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.494456 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.494502 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.494515 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.494532 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.494542 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.495059 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.495245 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.495348 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:05:19.495329987 +0000 UTC m=+39.155170750 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.512299 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-1
1ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:18Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.516982 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.517038 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.517057 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.517081 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.517098 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.532818 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:18Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.536677 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.536717 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.536726 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.536843 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.536855 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.554658 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:18Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.557824 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.557862 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.557875 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.557892 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.557906 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.575761 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:18Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.581336 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.581375 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.581390 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.581406 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.581417 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.594053 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:18Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:18 crc kubenswrapper[4935]: E1217 09:05:18.594224 4935 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.595690 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.595733 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.595746 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.595763 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.595774 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.698759 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.698823 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.698841 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.698865 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.698882 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.801718 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.801785 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.801808 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.801839 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.801862 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.905229 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.905352 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.905375 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.905399 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:18 crc kubenswrapper[4935]: I1217 09:05:18.905418 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:18Z","lastTransitionTime":"2025-12-17T09:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.007394 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.007477 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.007501 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.007533 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.007550 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.111641 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.111706 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.111728 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.111752 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.111769 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.123393 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:19 crc kubenswrapper[4935]: E1217 09:05:19.123664 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.214705 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.214775 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.214793 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.214815 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.214831 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.317498 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.317543 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.317556 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.317575 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.317589 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.420420 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.420491 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.420502 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.420519 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.420530 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.440515 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/1.log" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.441070 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/0.log" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.443555 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a" exitCode=1 Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.443623 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.443689 4935 scope.go:117] "RemoveContainer" containerID="b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.444648 4935 scope.go:117] "RemoveContainer" containerID="d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a" Dec 17 09:05:19 crc kubenswrapper[4935]: E1217 09:05:19.444850 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.445494 4935 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" event={"ID":"8a1fb087-2513-44cc-8dfd-e9879b0e840c","Type":"ContainerStarted","Data":"7d54478477ba08307d2c1513e8ea17c54a51ef369799cfc4300b635512a1f759"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.456264 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.472986 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1
d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1
d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"
containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:0
5:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.487310 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\
\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.503935 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc 
kubenswrapper[4935]: I1217 09:05:19.504214 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:19 crc kubenswrapper[4935]: E1217 09:05:19.504438 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:19 crc kubenswrapper[4935]: E1217 09:05:19.504542 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:05:21.504518074 +0000 UTC m=+41.164359027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.522483 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.523287 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.523334 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.523345 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.523362 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.523375 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.541151 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.562612 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.579256 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.593726 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.607616 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.625157 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.627327 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc 
kubenswrapper[4935]: I1217 09:05:19.627385 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.627398 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.627426 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.627439 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.643359 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.659975 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be
9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.677381 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.701428 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 
09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\
\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.719163 4935 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:19Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.729577 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.729605 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.729615 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.729630 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.729640 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.838980 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.839025 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.839037 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.839057 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.839070 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.942619 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.942673 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.942686 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.942708 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:19 crc kubenswrapper[4935]: I1217 09:05:19.942722 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:19Z","lastTransitionTime":"2025-12-17T09:05:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.045593 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.045668 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.045687 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.045721 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.045742 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.124034 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.124041 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.124069 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:20 crc kubenswrapper[4935]: E1217 09:05:20.124256 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:20 crc kubenswrapper[4935]: E1217 09:05:20.124770 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:20 crc kubenswrapper[4935]: E1217 09:05:20.124487 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.148434 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.148540 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.148578 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.148611 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.148632 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.252216 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.252268 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.252313 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.252336 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.252349 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.361024 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.361125 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.361147 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.361195 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.361208 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.452499 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" event={"ID":"8a1fb087-2513-44cc-8dfd-e9879b0e840c","Type":"ContainerStarted","Data":"b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.452562 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" event={"ID":"8a1fb087-2513-44cc-8dfd-e9879b0e840c","Type":"ContainerStarted","Data":"ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.454970 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/1.log" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.463709 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.463783 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.463807 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.463832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.463851 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.474175 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.494836 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 
handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6ea
ee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.511306 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.525558 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.539638 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.552233 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc 
kubenswrapper[4935]: I1217 09:05:20.567208 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.567257 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.567294 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.567315 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.567330 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.567742 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c2
45fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.583323 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.598003 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.614587 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.632700 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.644361 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.656296 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.669875 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.669926 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.669940 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.669967 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.669982 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.673319 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:
05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.689468 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b12691
5099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.710341 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:20Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.772672 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.772705 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.772716 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.772738 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.772752 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.875468 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.875516 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.875530 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.875549 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.875563 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.978547 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.978593 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.978605 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.978624 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:20 crc kubenswrapper[4935]: I1217 09:05:20.978636 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:20Z","lastTransitionTime":"2025-12-17T09:05:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.081176 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.081232 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.081313 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.081340 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.081357 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.123223 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:21 crc kubenswrapper[4935]: E1217 09:05:21.123443 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.141251 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.155874 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.169976 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.184639 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.184695 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.184712 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.184735 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.184757 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.194943 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 
09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\
\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.209857 4935 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc 
kubenswrapper[4935]: I1217 09:05:21.227542 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.252459 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypo
int\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\"
:\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.266771 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b1
34bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\
\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.277605 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.286973 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.287000 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.287009 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.287024 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 
09:05:21.287035 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.292140 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.306727 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-cont
roller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.321480 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.339108 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.353059 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.364564 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.374160 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:21Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.388995 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.389038 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.389052 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.389069 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.389080 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.491533 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.491619 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.491645 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.491682 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.491709 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.527951 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:21 crc kubenswrapper[4935]: E1217 09:05:21.528131 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:21 crc kubenswrapper[4935]: E1217 09:05:21.528197 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:05:25.52817888 +0000 UTC m=+45.188019653 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.594743 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.594781 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.594790 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.594808 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.594818 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.698748 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.698804 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.698884 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.698929 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.698944 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.802587 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.802676 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.802701 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.802737 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.802762 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.905353 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.905410 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.905422 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.905437 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:21 crc kubenswrapper[4935]: I1217 09:05:21.905446 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:21Z","lastTransitionTime":"2025-12-17T09:05:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.008474 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.008519 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.008534 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.008550 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.008561 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.112171 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.112214 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.112225 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.112246 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.112258 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.123842 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.123998 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.124105 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:22 crc kubenswrapper[4935]: E1217 09:05:22.124105 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:22 crc kubenswrapper[4935]: E1217 09:05:22.124216 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:22 crc kubenswrapper[4935]: E1217 09:05:22.124430 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.215643 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.215696 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.215709 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.215729 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.215744 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.319032 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.319092 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.319105 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.319128 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.319143 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.423967 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.424034 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.424051 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.424074 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.424099 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.527804 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.527854 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.527864 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.527884 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.527895 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.630938 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.630998 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.631014 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.631037 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.631053 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.735184 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.735257 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.735315 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.735349 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.735373 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.839845 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.839936 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.839957 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.839986 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.840004 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.942977 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.943564 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.943736 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.943899 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:22 crc kubenswrapper[4935]: I1217 09:05:22.944044 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:22Z","lastTransitionTime":"2025-12-17T09:05:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.047526 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.047601 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.047621 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.047649 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.047668 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.123404 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:23 crc kubenswrapper[4935]: E1217 09:05:23.123601 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.150749 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.150829 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.150863 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.150898 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.150922 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.254237 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.254352 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.254367 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.254393 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.254408 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.357598 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.357696 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.357748 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.357789 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.357822 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.462055 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.462125 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.462146 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.462178 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.462198 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.566595 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.566676 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.566702 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.566733 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.566752 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.669949 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.670030 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.670083 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.670114 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.670133 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.774695 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.774770 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.774793 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.774823 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.774846 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.877838 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.877897 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.877920 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.877943 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.877956 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.980262 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.980349 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.980361 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.980379 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:23 crc kubenswrapper[4935]: I1217 09:05:23.980393 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:23Z","lastTransitionTime":"2025-12-17T09:05:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.082810 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.082869 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.082884 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.082906 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.082924 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.123198 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.123305 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:24 crc kubenswrapper[4935]: E1217 09:05:24.123424 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.123500 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:24 crc kubenswrapper[4935]: E1217 09:05:24.123604 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:24 crc kubenswrapper[4935]: E1217 09:05:24.123762 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.185708 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.185781 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.185795 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.185814 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.185827 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.289422 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.289489 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.289501 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.289525 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.289543 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.393142 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.393217 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.393238 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.393312 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.393340 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.496699 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.496747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.496762 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.496780 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.496792 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.599800 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.599868 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.599954 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.599995 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.600024 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.703558 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.703625 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.703642 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.703668 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.703686 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.812571 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.812681 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.812696 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.812730 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.812756 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.916860 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.917008 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.917029 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.917060 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:24 crc kubenswrapper[4935]: I1217 09:05:24.917079 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:24Z","lastTransitionTime":"2025-12-17T09:05:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.020529 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.020618 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.020638 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.020671 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.020698 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.123110 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.123156 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.123167 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.123173 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.123187 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: E1217 09:05:25.123361 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.123384 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.225679 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.225732 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.225743 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.225759 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.225771 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.328727 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.328797 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.328818 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.328848 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.328865 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.431743 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.431796 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.431810 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.431832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.431847 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.536192 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.536252 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.536313 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.536341 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.536359 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.564676 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:25 crc kubenswrapper[4935]: E1217 09:05:25.564902 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:25 crc kubenswrapper[4935]: E1217 09:05:25.565020 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:05:33.564997379 +0000 UTC m=+53.224838142 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.640420 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.640496 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.640512 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.640531 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.640543 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.743034 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.743081 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.743094 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.743115 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.743133 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.847070 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.847131 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.847145 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.847168 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.847181 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.949907 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.949954 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.949965 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.949982 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:25 crc kubenswrapper[4935]: I1217 09:05:25.949995 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:25Z","lastTransitionTime":"2025-12-17T09:05:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.053462 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.053499 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.053511 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.053526 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.053536 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.124156 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.124214 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.124449 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:26 crc kubenswrapper[4935]: E1217 09:05:26.124486 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:26 crc kubenswrapper[4935]: E1217 09:05:26.124619 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:26 crc kubenswrapper[4935]: E1217 09:05:26.124756 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.156645 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.156733 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.156758 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.156793 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.156818 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.261114 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.261178 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.261198 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.261226 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.261244 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.364948 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.365034 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.365189 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.365314 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.365343 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.468296 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.468341 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.468352 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.468368 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.468380 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.571422 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.571485 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.571497 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.571519 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.571536 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.674215 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.674879 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.674963 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.675047 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.675120 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.779212 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.779616 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.779778 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.779933 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.780106 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.884026 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.884117 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.884141 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.884174 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.884201 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.987973 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.988535 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.988734 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.989029 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:26 crc kubenswrapper[4935]: I1217 09:05:26.989239 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:26Z","lastTransitionTime":"2025-12-17T09:05:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.092026 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.092110 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.092130 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.092199 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.092220 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.124107 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:27 crc kubenswrapper[4935]: E1217 09:05:27.124362 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.195210 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.195351 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.195369 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.195396 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.195413 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.298211 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.298302 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.298314 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.298343 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.298361 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.400815 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.400898 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.400910 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.400929 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.400944 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.502944 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.502986 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.502995 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.503009 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.503019 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.605513 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.605574 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.605584 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.605600 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.605609 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.708143 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.708192 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.708208 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.708229 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.708242 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.810810 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.810865 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.810887 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.810905 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.810918 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.914177 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.914234 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.914248 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.914285 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:27 crc kubenswrapper[4935]: I1217 09:05:27.914298 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:27Z","lastTransitionTime":"2025-12-17T09:05:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.017308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.017372 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.017383 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.017402 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.017414 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.120708 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.120754 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.120766 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.120783 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.120793 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.124041 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.124109 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.124131 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.124343 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.124450 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.124513 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.223737 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.223805 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.223817 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.223836 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.223848 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.326387 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.326436 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.326445 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.326461 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.326473 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.429090 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.429137 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.429147 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.429163 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.429176 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.532007 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.532109 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.532131 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.532160 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.532179 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.636226 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.636318 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.636334 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.636358 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.636372 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.670943 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.670997 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.671011 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.671036 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.671052 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.690142 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:28Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.695347 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.695431 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.695451 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.695479 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.695501 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.713629 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:28Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.718970 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.719035 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.719061 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.719095 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.719118 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.734299 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:28Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.739153 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.739230 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.739315 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.739339 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.739358 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.754930 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:28Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.759242 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.760355 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.760419 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.760443 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.760461 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.775515 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:28Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:28 crc kubenswrapper[4935]: E1217 09:05:28.776083 4935 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.779495 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.779555 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.779567 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.779594 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.779610 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.883149 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.883463 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.883596 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.883702 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.883808 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.986111 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.986158 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.986170 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.986189 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:28 crc kubenswrapper[4935]: I1217 09:05:28.986201 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:28Z","lastTransitionTime":"2025-12-17T09:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.093057 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.093417 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.093592 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.093802 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.093949 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.123822 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:29 crc kubenswrapper[4935]: E1217 09:05:29.123972 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.196400 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.196707 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.196804 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.196891 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.196984 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.300093 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.300141 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.300150 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.300166 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.300175 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.404469 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.404525 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.404534 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.404556 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.404567 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.507487 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.507785 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.507857 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.507934 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.507992 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.610556 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.610598 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.610608 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.610623 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.610635 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.713308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.713345 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.713354 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.713371 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.713382 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.818684 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.818735 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.818747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.818764 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.818777 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.922927 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.922975 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.922986 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.923009 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:29 crc kubenswrapper[4935]: I1217 09:05:29.923021 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:29Z","lastTransitionTime":"2025-12-17T09:05:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.025686 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.025733 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.025741 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.025758 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.025768 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.123878 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.123985 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.123881 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:30 crc kubenswrapper[4935]: E1217 09:05:30.124023 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:30 crc kubenswrapper[4935]: E1217 09:05:30.124196 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:30 crc kubenswrapper[4935]: E1217 09:05:30.124362 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.128653 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.128706 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.128720 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.128740 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.128752 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.231953 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.231989 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.231997 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.232013 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.232023 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.334964 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.335005 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.335014 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.335030 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.335041 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.438513 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.438561 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.438591 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.438611 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.438627 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.542150 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.542200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.542213 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.542233 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.542244 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.645196 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.645264 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.645331 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.645353 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.645366 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.749305 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.749411 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.749456 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.749494 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.749517 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.852315 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.852369 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.852379 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.852397 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.852411 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.955495 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.955597 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.955621 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.955657 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:30 crc kubenswrapper[4935]: I1217 09:05:30.955683 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:30Z","lastTransitionTime":"2025-12-17T09:05:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.059109 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.059156 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.059164 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.059182 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.059193 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.123799 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:31 crc kubenswrapper[4935]: E1217 09:05:31.123996 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.141179 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.161545 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.161598 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.161607 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.161624 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.161634 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.164139 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.179424 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.180482 4935 scope.go:117] "RemoveContainer" containerID="d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.181358 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c
8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.195437 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc 
kubenswrapper[4935]: I1217 09:05:31.212301 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.229431 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.243743 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.258127 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.265570 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.265639 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.265657 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.265686 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.265704 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.271760 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.286788 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f
4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.300127 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.330875 4935 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.359655 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.368425 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.368480 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.368491 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.368509 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.368520 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.382520 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.397487 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.423523 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1eefdd18d046a121114ac69d32feebe7bf7c20827ac12f6f5ef0734e994213e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:14Z\\\",\\\"message\\\":\\\" 6220 handler.go:208] Removed *v1.Node event handler 2\\\\nI1217 09:05:14.792050 6220 
handler.go:208] Removed *v1.Node event handler 7\\\\nI1217 09:05:14.792148 6220 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.792462 6220 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792572 6220 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.792840 6220 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1217 09:05:14.793089 6220 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1217 09:05:14.793260 6220 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1217 09:05:14.793444 6220 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.i
o/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6ea
ee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.438546 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.456564 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.468090 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.470937 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.471076 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.471137 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.471206 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.471299 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.487486 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.499458 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.502403 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/1.log" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.509995 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.513144 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.526411 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.539443 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.553584 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.567114 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.574531 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.574574 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.574585 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.574604 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.574615 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.581555 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.599603 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f
4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.618155 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.632572 4935 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.653308 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.669841 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.676432 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.676469 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.676480 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.676500 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.676512 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.779045 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.779091 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.779102 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.779121 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.779138 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.866617 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.876384 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.881590 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.881640 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.881658 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.881681 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.881697 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.884320 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.910772 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.926950 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.942453 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.956094 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.967659 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc 
kubenswrapper[4935]: I1217 09:05:31.981333 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.983974 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.984002 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.984014 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.984031 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.984044 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:31Z","lastTransitionTime":"2025-12-17T09:05:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:31 crc kubenswrapper[4935]: I1217 09:05:31.995975 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:31Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.009241 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.022576 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.039014 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.051605 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.063215 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.075640 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.086451 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc 
kubenswrapper[4935]: I1217 09:05:32.086480 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.086492 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.086510 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.086522 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.087555 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.099390 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.123259 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.123319 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.123304 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:32 crc kubenswrapper[4935]: E1217 09:05:32.123470 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:32 crc kubenswrapper[4935]: E1217 09:05:32.123616 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:32 crc kubenswrapper[4935]: E1217 09:05:32.123719 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.190211 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.190268 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.190301 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.190323 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.190337 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.293588 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.293651 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.293674 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.293704 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.293724 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.397025 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.397105 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.397125 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.397155 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.397175 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.500744 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.500822 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.500847 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.500880 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.500905 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.515055 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.539963 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.585753 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\
\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.604573 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.604659 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.604680 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.604711 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.604734 4935 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.606345 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountP
ath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.626434 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-c
ni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.643543 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c
8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.662701 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc 
kubenswrapper[4935]: I1217 09:05:32.688729 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.706196 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.708188 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.708287 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.708300 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc 
kubenswrapper[4935]: I1217 09:05:32.708321 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.708338 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.720503 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 
09:05:32.736099 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.750304 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.763254 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.781966 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.798143 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.811027 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.811083 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.811094 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.811114 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.811128 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.815543 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.831192 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: 
connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Complete
d\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.846032 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:32Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.914435 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.914493 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.914506 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.914529 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:32 crc kubenswrapper[4935]: I1217 09:05:32.914542 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:32Z","lastTransitionTime":"2025-12-17T09:05:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.018065 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.018156 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.018170 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.018193 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.018212 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.121195 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.121260 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.121296 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.121324 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.121341 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.123568 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:33 crc kubenswrapper[4935]: E1217 09:05:33.123761 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.224489 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.224573 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.224592 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.224622 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.224641 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.327489 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.327559 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.327578 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.327610 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.327640 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.438821 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.438885 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.438894 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.438927 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.438938 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.525839 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/2.log" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.526535 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/1.log" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.529910 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520" exitCode=1 Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.529959 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.530015 4935 scope.go:117] "RemoveContainer" containerID="d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.530945 4935 scope.go:117] "RemoveContainer" containerID="ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520" Dec 17 09:05:33 crc kubenswrapper[4935]: E1217 09:05:33.531204 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.541399 4935 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.541434 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.541446 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.541461 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.541474 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.549456 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.582639 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7104c2c495e1e2605c2a408a543d8f4a4f40b0e1decf4e0566c4261c1fd890a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:19Z\\\",\\\"message\\\":\\\":17.223558 6366 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223544 6366 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-operator]} name:Service_openshift-machine-api/machine-api-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.21:8443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {58a148b3-0a7b-4412-b447-f87788c4883f}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI1217 09:05:17.223574 6366 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2\\\\nI1217 09:05:17.223583 6366 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-additional-cni-plugins-qzmn2 in node crc\\\\nI1217 09:05:17.223590 6366 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-additional-cni-plugins-qzmn2 after 0 failed attempt(s)\\\\nI1217 09:05:17.223595 6366 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-additional-cni-pl\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod 
openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf790
14f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.600365 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.619517 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.633476 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.644922 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.644982 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.644999 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.645029 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.645046 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.650796 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc 
kubenswrapper[4935]: I1217 09:05:33.659035 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:33 crc kubenswrapper[4935]: E1217 09:05:33.659341 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:33 crc kubenswrapper[4935]: E1217 09:05:33.659408 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:05:49.659387657 +0000 UTC m=+69.319228420 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.671132 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.685327 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.701541 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.717652 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.728812 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.744988 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.747995 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc 
kubenswrapper[4935]: I1217 09:05:33.748034 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.748051 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.748074 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.748090 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.764743 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.780038 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.796069 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.814130 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.828743 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:33Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.852003 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.852245 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.852407 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.852501 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.852618 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.956459 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.956538 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.956561 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.956593 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.956621 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:33Z","lastTransitionTime":"2025-12-17T09:05:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:33 crc kubenswrapper[4935]: I1217 09:05:33.962989 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:05:33 crc kubenswrapper[4935]: E1217 09:05:33.963255 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-17 09:06:05.963219959 +0000 UTC m=+85.623060752 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.060930 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.061531 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.061622 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.061724 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.061838 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.064495 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.064557 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.064619 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.064644 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.064794 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.064828 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.064840 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.064903 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:06:06.06488295 +0000 UTC m=+85.724723713 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065167 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065256 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-17 09:06:06.065229858 +0000 UTC m=+85.725070631 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065340 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065362 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065375 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065424 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:06:06.065412453 +0000 UTC m=+85.725253426 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065558 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.065687 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:06:06.065673199 +0000 UTC m=+85.725513972 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.124020 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.124177 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.124040 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.124546 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.124040 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.124927 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.164314 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.164383 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.164394 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.164413 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.164426 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.267531 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.267640 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.267662 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.267693 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.267712 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.371002 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.371086 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.371105 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.371133 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.371159 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.475005 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.475066 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.475084 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.475109 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.475127 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.538140 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/2.log" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.544553 4935 scope.go:117] "RemoveContainer" containerID="ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520" Dec 17 09:05:34 crc kubenswrapper[4935]: E1217 09:05:34.544863 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.564361 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.578163 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.578254 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.578309 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.578342 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.578362 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.592280 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.608753 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.625293 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc 
kubenswrapper[4935]: I1217 09:05:34.640080 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.654450 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.669608 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.681458 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 
09:05:34.681540 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.681568 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.681597 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.681616 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.686089 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.698692 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.713353 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.729056 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.748226 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.764364 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.782416 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.784560 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.784611 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.784630 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.784658 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.784678 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.796320 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.814171 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.841481 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:34Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.886992 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.887098 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.887118 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.887144 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.887162 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.991446 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.991519 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.991538 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.991566 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:34 crc kubenswrapper[4935]: I1217 09:05:34.991587 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:34Z","lastTransitionTime":"2025-12-17T09:05:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.094796 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.095253 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.095522 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.095741 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.095941 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.123525 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:35 crc kubenswrapper[4935]: E1217 09:05:35.124089 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.199148 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.199203 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.199212 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.199230 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.199240 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.302286 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.302336 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.302346 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.302362 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.302373 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.405778 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.405829 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.405846 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.405866 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.405878 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.508819 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.508860 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.508869 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.508884 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.508895 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.611431 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.611483 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.611492 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.611513 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.611524 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.714639 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.714683 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.714695 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.714713 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.714726 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.816785 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.816831 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.816843 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.816861 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.816874 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.919942 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.919996 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.920009 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.920031 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:35 crc kubenswrapper[4935]: I1217 09:05:35.920044 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:35Z","lastTransitionTime":"2025-12-17T09:05:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.022579 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.022618 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.022627 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.022641 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.022650 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.123966 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.124014 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.124014 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:36 crc kubenswrapper[4935]: E1217 09:05:36.124510 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:36 crc kubenswrapper[4935]: E1217 09:05:36.124668 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:36 crc kubenswrapper[4935]: E1217 09:05:36.124773 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.125525 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.125570 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.125584 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.125603 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.125619 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.229456 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.229507 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.229519 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.229594 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.229620 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.333087 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.333139 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.333150 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.333169 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.333181 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.436656 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.436710 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.436726 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.436748 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.436763 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.539533 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.539596 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.539620 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.539653 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.539675 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.643441 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.643509 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.643533 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.643564 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.643588 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.747026 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.747107 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.747119 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.747140 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.747151 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.849891 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.849945 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.849961 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.849981 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.849995 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.952961 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.953016 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.953028 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.953049 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:36 crc kubenswrapper[4935]: I1217 09:05:36.953061 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:36Z","lastTransitionTime":"2025-12-17T09:05:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.055780 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.055832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.055842 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.055860 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.055873 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.123846 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:37 crc kubenswrapper[4935]: E1217 09:05:37.124054 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.158939 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.159007 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.159022 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.159050 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.159070 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.262580 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.262649 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.262666 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.262692 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.262712 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.366886 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.366926 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.366940 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.366956 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.366969 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.470317 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.470370 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.470382 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.470401 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.470417 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.573405 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.573467 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.573481 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.573502 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.573521 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.676744 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.676820 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.676838 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.676863 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.676884 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.779501 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.779534 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.779541 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.779559 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.779569 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.882470 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.882517 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.882531 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.882550 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.882564 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.984823 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.984891 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.984911 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.984936 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:37 crc kubenswrapper[4935]: I1217 09:05:37.984956 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:37Z","lastTransitionTime":"2025-12-17T09:05:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.088482 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.088535 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.088549 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.088566 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.088584 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.123416 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.123566 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.123750 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.123761 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.123814 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.124011 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.192139 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.192189 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.192198 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.192215 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.192228 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.295251 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.295388 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.295408 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.295438 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.295459 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.399367 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.399476 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.399491 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.399515 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.399528 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.503711 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.503766 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.503795 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.503817 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.503832 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.607028 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.607113 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.607141 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.607179 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.607199 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.711582 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.711664 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.711684 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.711717 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.711738 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.815224 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.815308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.815323 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.815345 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.815358 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.878086 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.878172 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.878192 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.878220 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.878239 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.891734 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:38Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.897536 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.897589 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.897603 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.897623 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.897639 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.912897 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:38Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.917678 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.917720 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.917734 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.917753 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.917764 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.934902 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:38Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.939613 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.939680 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.939691 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.939711 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.939726 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.954903 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:38Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.959810 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.959877 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.959894 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.959924 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.959944 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.976807 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:38Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:38 crc kubenswrapper[4935]: E1217 09:05:38.976922 4935 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.979657 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.979715 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.979729 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.979762 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:38 crc kubenswrapper[4935]: I1217 09:05:38.979785 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:38Z","lastTransitionTime":"2025-12-17T09:05:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.083703 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.083843 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.083863 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.083892 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.083911 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.123776 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:39 crc kubenswrapper[4935]: E1217 09:05:39.123963 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.186568 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.186623 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.186637 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.186662 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.186678 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.289705 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.289762 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.289775 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.289794 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.289808 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.393320 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.393378 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.393392 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.393410 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.393420 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.495758 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.495837 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.495856 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.495893 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.495911 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.599120 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.599201 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.599217 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.599242 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.599335 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.703320 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.703386 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.703402 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.703422 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.703439 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.808900 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.808950 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.808962 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.808981 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.808995 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.913574 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.913665 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.913719 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.913747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:39 crc kubenswrapper[4935]: I1217 09:05:39.913765 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:39Z","lastTransitionTime":"2025-12-17T09:05:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.017248 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.017732 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.017777 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.017801 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.017819 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.122062 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.122134 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.122153 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.122181 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.122199 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.123408 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.123408 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.123645 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:40 crc kubenswrapper[4935]: E1217 09:05:40.123773 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:40 crc kubenswrapper[4935]: E1217 09:05:40.123872 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:40 crc kubenswrapper[4935]: E1217 09:05:40.124141 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.225957 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.226013 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.226033 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.226060 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.226079 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.329610 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.329829 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.329851 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.329884 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.329905 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.434228 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.434320 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.434330 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.434351 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.434363 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.538590 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.538660 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.538682 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.538713 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.538733 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.642536 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.642635 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.642656 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.642685 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.642705 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.746947 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.747014 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.747030 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.747053 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.747068 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.851072 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.851137 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.851149 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.851169 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.851183 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.954905 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.954974 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.954994 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.955020 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:40 crc kubenswrapper[4935]: I1217 09:05:40.955039 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:40Z","lastTransitionTime":"2025-12-17T09:05:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.057948 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.058026 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.058040 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.058063 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.058078 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.123767 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:41 crc kubenswrapper[4935]: E1217 09:05:41.123936 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.144155 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.162193 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.162401 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.162447 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.162459 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.162482 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.162496 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.178577 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.202024 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.218227 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.233448 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.252586 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.265865 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.265921 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.265936 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.265960 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.265976 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.271750 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.288154 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc 
kubenswrapper[4935]: I1217 09:05:41.307041 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.323800 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.339750 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.355010 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.369148 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.369220 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.369236 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.369259 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.369292 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.369153 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.380484 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.394611 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.412844 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:41Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.471668 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc 
kubenswrapper[4935]: I1217 09:05:41.471731 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.471741 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.471762 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.471776 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.574328 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.574386 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.574400 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.574424 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.574443 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.677187 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.677315 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.677328 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.677348 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.677361 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.783978 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.784085 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.784116 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.784149 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.784167 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.887683 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.887759 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.887782 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.887815 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.887839 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.990510 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.990563 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.990580 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.990606 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:41 crc kubenswrapper[4935]: I1217 09:05:41.990624 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:41Z","lastTransitionTime":"2025-12-17T09:05:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.095121 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.095182 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.095209 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.095232 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.095248 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.124142 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.124154 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:42 crc kubenswrapper[4935]: E1217 09:05:42.124362 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.124403 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:42 crc kubenswrapper[4935]: E1217 09:05:42.124534 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:42 crc kubenswrapper[4935]: E1217 09:05:42.124618 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.198918 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.198975 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.198988 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.199015 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.199029 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.302569 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.302638 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.302656 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.302682 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.302701 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.406701 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.406774 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.406792 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.406820 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.406837 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.509573 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.509647 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.509674 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.509707 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.509733 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.613085 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.613131 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.613144 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.613163 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.613177 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.716447 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.716525 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.716545 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.716572 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.716595 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.819736 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.819811 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.819828 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.819858 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.819876 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.922158 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.922200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.922209 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.922225 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:42 crc kubenswrapper[4935]: I1217 09:05:42.922237 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:42Z","lastTransitionTime":"2025-12-17T09:05:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.026323 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.026396 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.026442 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.026494 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.026522 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.124214 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:43 crc kubenswrapper[4935]: E1217 09:05:43.124507 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.130498 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.130561 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.130583 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.130682 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.130708 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.234634 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.234707 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.234725 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.234752 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.234770 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.337610 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.337696 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.337714 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.337741 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.337760 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.441415 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.441533 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.441548 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.441572 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.441588 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.544969 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.545032 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.545049 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.545078 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.545094 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.648013 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.648112 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.648126 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.648152 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.648170 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.752877 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.752955 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.752975 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.753003 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.753021 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.856056 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.856100 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.856112 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.856156 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.856171 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.958821 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.958924 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.958957 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.958987 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:43 crc kubenswrapper[4935]: I1217 09:05:43.959009 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:43Z","lastTransitionTime":"2025-12-17T09:05:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.062355 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.062454 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.062464 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.062479 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.062492 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.123875 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.123946 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.124053 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:44 crc kubenswrapper[4935]: E1217 09:05:44.124129 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:44 crc kubenswrapper[4935]: E1217 09:05:44.124375 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:44 crc kubenswrapper[4935]: E1217 09:05:44.124576 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.165305 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.165379 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.165392 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.165412 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.165422 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.272832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.272885 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.272895 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.272917 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.272929 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.375024 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.375102 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.375121 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.375150 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.375170 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.478701 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.478744 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.478756 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.478774 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.478787 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.582458 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.582553 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.582579 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.582645 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.582669 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.687208 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.687259 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.687295 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.687316 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.687329 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.790965 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.791073 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.791155 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.791248 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.791321 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.894343 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.894383 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.894394 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.894409 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.894419 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.997687 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.997749 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.997767 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.997794 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:44 crc kubenswrapper[4935]: I1217 09:05:44.997821 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:44Z","lastTransitionTime":"2025-12-17T09:05:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.100654 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.100780 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.100799 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.100828 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.100847 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.123533 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:45 crc kubenswrapper[4935]: E1217 09:05:45.123766 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.204189 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.204238 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.204249 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.204267 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.204299 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.307205 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.307263 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.307302 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.307323 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.307335 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.410179 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.410229 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.410242 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.410262 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.410290 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.513490 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.513543 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.513559 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.513584 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.513600 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.616847 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.616898 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.616911 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.616933 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.616948 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.719665 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.719719 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.719729 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.719749 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.719762 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.822511 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.822576 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.822591 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.822615 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.822628 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.925486 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.925541 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.925552 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.925575 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:45 crc kubenswrapper[4935]: I1217 09:05:45.925588 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:45Z","lastTransitionTime":"2025-12-17T09:05:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.028554 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.028612 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.028623 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.028642 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.028654 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.123766 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.123766 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:46 crc kubenswrapper[4935]: E1217 09:05:46.123926 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.123791 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:46 crc kubenswrapper[4935]: E1217 09:05:46.123984 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:46 crc kubenswrapper[4935]: E1217 09:05:46.124073 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.132007 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.132097 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.132128 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.132163 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.132192 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.234627 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.234680 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.234694 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.234715 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.234732 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.337539 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.338049 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.338061 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.338076 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.338088 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.441117 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.441181 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.441200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.441218 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.441727 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.544615 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.544665 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.544675 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.544694 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.544704 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.647820 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.647856 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.647866 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.647882 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.647892 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.750832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.750902 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.750913 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.750932 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.750965 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.853654 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.853700 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.853709 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.853724 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.853734 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.956021 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.956070 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.956081 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.956102 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:46 crc kubenswrapper[4935]: I1217 09:05:46.956113 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:46Z","lastTransitionTime":"2025-12-17T09:05:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.058697 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.058740 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.058750 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.058764 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.058776 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.123619 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:47 crc kubenswrapper[4935]: E1217 09:05:47.123821 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.161537 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.162141 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.162176 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.162221 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.162245 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.265122 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.265202 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.265211 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.265234 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.265246 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.368200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.368256 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.368266 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.368301 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.368351 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.472112 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.472208 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.472221 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.472241 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.472255 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.575146 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.575188 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.575196 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.575217 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.575228 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.678214 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.678264 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.678311 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.678334 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.678348 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.781383 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.781449 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.781461 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.781483 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.781898 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.885386 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.885458 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.885475 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.885504 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.885522 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.989074 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.989159 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.989185 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.989219 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:47 crc kubenswrapper[4935]: I1217 09:05:47.989238 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:47Z","lastTransitionTime":"2025-12-17T09:05:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.093005 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.093079 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.093104 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.093139 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.093165 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.124142 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.124220 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:48 crc kubenswrapper[4935]: E1217 09:05:48.124321 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.124388 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:48 crc kubenswrapper[4935]: E1217 09:05:48.124482 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:48 crc kubenswrapper[4935]: E1217 09:05:48.124970 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.125223 4935 scope.go:117] "RemoveContainer" containerID="ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520" Dec 17 09:05:48 crc kubenswrapper[4935]: E1217 09:05:48.125546 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.196297 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.196677 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.196770 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.196860 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.196931 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.300189 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.300255 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.300297 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.300324 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.300343 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.403267 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.403356 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.403366 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.403403 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.403423 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.506221 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.506267 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.506308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.506351 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.506364 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.609469 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.609575 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.609595 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.609628 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.609648 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.713996 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.714067 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.714089 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.714120 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.714139 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.817315 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.817606 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.817687 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.817774 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.817837 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.920224 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.920301 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.920313 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.920334 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:48 crc kubenswrapper[4935]: I1217 09:05:48.920352 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:48Z","lastTransitionTime":"2025-12-17T09:05:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.022556 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.022619 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.022630 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.022647 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.022660 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.023555 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.023592 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.023606 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.023622 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.023635 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.040707 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:49Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.046062 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.046118 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.046130 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.046148 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.046160 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.061531 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:49Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.065789 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.065844 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.065853 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.065873 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.065887 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.083754 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:49Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.088081 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.088129 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.088144 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.088167 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.088184 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.103128 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:49Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.107643 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.107685 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.107699 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.107723 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.107737 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.122731 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:49Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.123222 4935 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.123584 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.123757 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.125684 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.125735 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.125747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.125766 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.125784 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.229499 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.229540 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.229552 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.229575 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.229587 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.333369 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.334021 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.334051 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.334074 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.334088 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.436998 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.437042 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.437054 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.437073 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.437084 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.540684 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.540766 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.540776 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.540849 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.540920 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.644113 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.644164 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.644178 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.644199 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.644214 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.661425 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.661695 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:49 crc kubenswrapper[4935]: E1217 09:05:49.661820 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:06:21.661792013 +0000 UTC m=+101.321632776 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.747833 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.747881 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.747895 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.747914 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.747927 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.850698 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.850739 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.850747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.850766 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.850778 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.953392 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.953442 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.953456 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.953472 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:49 crc kubenswrapper[4935]: I1217 09:05:49.953484 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:49Z","lastTransitionTime":"2025-12-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.056193 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.056259 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.056313 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.056344 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.056364 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.123763 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.123853 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.123963 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:50 crc kubenswrapper[4935]: E1217 09:05:50.123964 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:50 crc kubenswrapper[4935]: E1217 09:05:50.124156 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:50 crc kubenswrapper[4935]: E1217 09:05:50.124317 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.159071 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.159149 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.159162 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.159181 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.159196 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.261184 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.261227 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.261238 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.261257 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.261292 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.364312 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.364358 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.364372 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.364391 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.364405 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.467067 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.467119 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.467128 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.467146 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.467161 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.571008 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.571116 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.571144 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.571185 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.571229 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.673879 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.673915 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.673927 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.673946 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.673958 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.776170 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.776218 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.776231 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.776249 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.776262 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.879664 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.879721 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.879738 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.879758 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.879773 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.982665 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.982702 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.982713 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.982731 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:50 crc kubenswrapper[4935]: I1217 09:05:50.982743 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:50Z","lastTransitionTime":"2025-12-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.085630 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.085675 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.085686 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.085706 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.085717 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.123223 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:51 crc kubenswrapper[4935]: E1217 09:05:51.123414 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.143046 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b
18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.160714 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.174574 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.188682 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.188752 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.188776 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.188800 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.188818 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.188929 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.206881 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f
4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.225767 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.247737 4935 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.263412 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.280491 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.291883 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.291937 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.291952 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.291977 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.291993 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.294764 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.310232 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.330946 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.346334 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.361377 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.375648 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18
d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.387021 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc 
kubenswrapper[4935]: I1217 09:05:51.397029 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.397065 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.397077 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.397095 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.397109 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.399768 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85
b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.501446 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.501861 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.501954 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.502053 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.502145 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.605327 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.605376 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.605386 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.605410 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.605420 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.616012 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/0.log" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.616089 4935 generic.go:334] "Generic (PLEG): container finished" podID="8b52811a-aff2-43c1-9074-f0654f991d9c" containerID="f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944" exitCode=1 Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.616149 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerDied","Data":"f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.616877 4935 scope.go:117] "RemoveContainer" containerID="f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.632953 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.657795 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for 
openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.674159 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25
b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.689126 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.699658 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc 
kubenswrapper[4935]: I1217 09:05:51.707676 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.707703 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.707718 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.707735 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.707747 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.712574 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85
b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.728009 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.743145 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.755713 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.768021 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.782814 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:50Z\\\",\\\"message\\\":\\\"2025-12-17T09:05:05+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f\\\\n2025-12-17T09:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f to /host/opt/cni/bin/\\\\n2025-12-17T09:05:05Z [verbose] multus-daemon started\\\\n2025-12-17T09:05:05Z [verbose] Readiness Indicator file check\\\\n2025-12-17T09:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.797320 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.811040 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.811061 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.811216 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.811228 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.811249 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.811285 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.825839 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.840042 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.853041 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.865838 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:51Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.914128 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.914179 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.914192 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.914215 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:51 crc kubenswrapper[4935]: I1217 09:05:51.914235 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:51Z","lastTransitionTime":"2025-12-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.016928 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.016982 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.016995 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.017016 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.017031 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.120815 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.120885 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.120896 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.120918 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.120931 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.123986 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:52 crc kubenswrapper[4935]: E1217 09:05:52.124177 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.124317 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.124341 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:52 crc kubenswrapper[4935]: E1217 09:05:52.124477 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:52 crc kubenswrapper[4935]: E1217 09:05:52.124552 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.223647 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.223700 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.223727 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.223753 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.223766 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.326691 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.326743 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.326759 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.326782 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.326797 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.429472 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.430258 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.430371 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.430462 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.431027 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.534806 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.534851 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.534862 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.534882 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.534900 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.627309 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/0.log" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.627376 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerStarted","Data":"f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.636774 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.636806 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.636814 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.636861 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.636871 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.643728 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.669250 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.689159 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25
b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.702811 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.715775 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc 
kubenswrapper[4935]: I1217 09:05:52.729777 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.740403 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.740460 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.740477 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.740498 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.740514 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.743027 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.759033 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.770808 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.784538 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.799694 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:50Z\\\",\\\"message\\\":\\\"2025-12-17T09:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f\\\\n2025-12-17T09:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f to /host/opt/cni/bin/\\\\n2025-12-17T09:05:05Z [verbose] multus-daemon started\\\\n2025-12-17T09:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-17T09:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.816349 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.832658 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.842882 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.842917 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.842931 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.842952 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.842971 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.851361 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.867485 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.884298 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.899386 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:52Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.946303 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.946362 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.946374 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.946397 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:52 crc kubenswrapper[4935]: I1217 09:05:52.946410 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:52Z","lastTransitionTime":"2025-12-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.049354 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.049424 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.049441 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.049471 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.049493 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.123518 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:53 crc kubenswrapper[4935]: E1217 09:05:53.123734 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.152226 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.152292 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.152310 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.152327 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.152341 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.255202 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.255253 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.255299 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.255321 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.255333 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.358338 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.358390 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.358399 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.358416 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.358429 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.461140 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.461200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.461216 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.461239 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.461256 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.565358 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.565414 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.565433 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.565455 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.565469 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.668238 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.668308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.668322 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.668339 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.668350 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.771981 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.772080 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.772102 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.772140 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.772161 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.875121 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.875211 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.875232 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.875263 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.875324 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.978819 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.978865 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.978876 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.978893 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:53 crc kubenswrapper[4935]: I1217 09:05:53.978905 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:53Z","lastTransitionTime":"2025-12-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.081651 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.081712 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.081726 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.081745 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.081759 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.124147 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.124209 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.124240 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:54 crc kubenswrapper[4935]: E1217 09:05:54.124348 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:54 crc kubenswrapper[4935]: E1217 09:05:54.124464 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:54 crc kubenswrapper[4935]: E1217 09:05:54.124570 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.184498 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.184558 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.184571 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.184593 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.184608 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.288242 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.288326 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.288340 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.288362 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.288379 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.391672 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.391722 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.391734 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.391755 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.391771 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.495191 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.495246 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.495266 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.495311 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.495325 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.599462 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.599540 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.599556 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.599580 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.599600 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.703705 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.703777 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.703791 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.703812 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.703824 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.807307 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.807391 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.807414 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.807448 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.807469 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.909699 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.909747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.909760 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.909782 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:54 crc kubenswrapper[4935]: I1217 09:05:54.909800 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:54Z","lastTransitionTime":"2025-12-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.012975 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.013027 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.013040 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.013059 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.013072 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.115844 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.116032 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.116047 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.116064 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.116074 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.123671 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:55 crc kubenswrapper[4935]: E1217 09:05:55.123857 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.219680 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.220013 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.220023 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.220038 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.220048 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.322948 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.323007 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.323019 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.323038 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.323052 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.425377 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.425475 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.425507 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.425543 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.425565 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.528225 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.528260 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.528287 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.528302 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.528312 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.632312 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.632370 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.632382 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.632401 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.632413 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.735983 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.736031 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.736042 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.736060 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.736072 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.839114 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.839168 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.839186 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.839204 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.839217 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.942659 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.942712 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.942730 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.942756 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:55 crc kubenswrapper[4935]: I1217 09:05:55.942774 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:55Z","lastTransitionTime":"2025-12-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.045060 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.045093 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.045101 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.045117 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.045127 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.123630 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.123819 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:56 crc kubenswrapper[4935]: E1217 09:05:56.123893 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.123969 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:56 crc kubenswrapper[4935]: E1217 09:05:56.124134 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:56 crc kubenswrapper[4935]: E1217 09:05:56.124249 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.147978 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.148040 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.148055 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.148079 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.148098 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.251071 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.251189 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.251232 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.251252 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.251268 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.354781 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.354860 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.354878 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.354906 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.354926 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.459208 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.459326 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.459344 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.459367 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.459387 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.563196 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.563266 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.563324 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.563353 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.563375 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.667103 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.667182 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.667215 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.667256 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.667342 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.771249 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.771372 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.771395 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.771425 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.771449 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.875643 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.875702 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.875712 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.875732 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.875746 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.979747 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.979806 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.979815 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.979832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:56 crc kubenswrapper[4935]: I1217 09:05:56.979842 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:56Z","lastTransitionTime":"2025-12-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.082560 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.082645 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.082667 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.082698 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.082719 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.124243 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:57 crc kubenswrapper[4935]: E1217 09:05:57.124488 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.185668 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.185732 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.185745 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.185767 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.185782 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.288354 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.288405 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.288443 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.288461 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.288474 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.391671 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.391950 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.392013 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.392079 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.392145 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.495512 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.495559 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.495573 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.495594 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.495608 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.598133 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.598173 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.598182 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.598197 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.598206 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.701359 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.701421 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.701431 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.701452 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.701462 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.804914 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.804991 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.805014 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.805048 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.805072 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.907939 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.907995 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.908005 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.908030 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:57 crc kubenswrapper[4935]: I1217 09:05:57.908043 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:57Z","lastTransitionTime":"2025-12-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.011403 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.011454 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.011463 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.011485 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.011497 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.114448 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.114539 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.114563 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.114593 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.114617 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.123669 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.123735 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.123699 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:05:58 crc kubenswrapper[4935]: E1217 09:05:58.123839 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:05:58 crc kubenswrapper[4935]: E1217 09:05:58.123950 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:05:58 crc kubenswrapper[4935]: E1217 09:05:58.124090 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.217902 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.217976 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.217998 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.218030 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.218052 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.320990 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.321058 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.321071 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.321090 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.321101 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.424539 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.424606 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.424628 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.424665 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.424692 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.528495 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.528563 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.528580 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.528612 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.528630 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.631344 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.631389 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.631400 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.631420 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.631434 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.734803 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.734863 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.734875 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.734899 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.734911 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.837480 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.837532 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.837546 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.837564 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.837576 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.940656 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.940708 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.940721 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.940752 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:58 crc kubenswrapper[4935]: I1217 09:05:58.940768 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:58Z","lastTransitionTime":"2025-12-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.043626 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.043673 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.043684 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.043703 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.043716 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.123524 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:05:59 crc kubenswrapper[4935]: E1217 09:05:59.123684 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.146471 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.146552 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.146570 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.146589 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.146601 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.250005 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.250070 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.250087 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.250116 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.250137 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.353199 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.353251 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.353260 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.353308 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.353327 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.361762 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.361819 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.361837 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.361859 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.361875 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: E1217 09:05:59.381265 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:59Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.386054 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.386112 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.386130 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.386160 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.386179 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: E1217 09:05:59.407433 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:59Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.412759 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.412852 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.412876 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.412905 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.412924 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: E1217 09:05:59.427474 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:59Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.433259 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.433316 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.433327 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.433344 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.433355 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: E1217 09:05:59.443925 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:59Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.448528 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.448641 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.448708 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.448775 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.448850 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: E1217 09:05:59.462255 4935 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"e24588ce-27b5-4ae2-a4f8-11ff903735be\\\",\\\"systemUUID\\\":\\\"a5a48762-63f5-465e-baf7-279b31b6b014\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:05:59Z is after 2025-08-24T17:21:41Z" Dec 17 09:05:59 crc kubenswrapper[4935]: E1217 09:05:59.462686 4935 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.464692 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.464803 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.464877 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.464970 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.465057 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.567692 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.567729 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.567738 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.567753 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.567763 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.669933 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.669969 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.669978 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.669994 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.670005 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.772952 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.773007 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.773019 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.773038 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.773049 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.876881 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.876932 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.876946 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.876964 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.876976 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.980458 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.980546 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.980572 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.980602 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:05:59 crc kubenswrapper[4935]: I1217 09:05:59.980633 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:05:59Z","lastTransitionTime":"2025-12-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.084561 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.084655 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.084681 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.084716 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.084745 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.123535 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.123526 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.123748 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:00 crc kubenswrapper[4935]: E1217 09:06:00.123963 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:00 crc kubenswrapper[4935]: E1217 09:06:00.124137 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:00 crc kubenswrapper[4935]: E1217 09:06:00.124353 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.125733 4935 scope.go:117] "RemoveContainer" containerID="ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.187808 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.188503 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.188549 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.188584 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.188600 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.290829 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.290864 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.290873 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.290890 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.290901 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.393684 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.393738 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.393753 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.393776 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.393792 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.497103 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.497151 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.497162 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.497181 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.497193 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.599860 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.599903 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.599912 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.599927 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.599939 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.660598 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/2.log" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.663598 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.664166 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.686423 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.702401 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.702438 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.702453 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc 
kubenswrapper[4935]: I1217 09:06:00.702472 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.702484 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.704401 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 
09:06:00.718768 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.731374 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.744702 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.759598 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:50Z\\\",\\\"message\\\":\\\"2025-12-17T09:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f\\\\n2025-12-17T09:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f to /host/opt/cni/bin/\\\\n2025-12-17T09:05:05Z [verbose] multus-daemon started\\\\n2025-12-17T09:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-17T09:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.774006 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.791624 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.805542 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.805666 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.805678 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.805699 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.805711 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.814413 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.836489 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.851292 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.871202 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.883712 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.900080 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.907677 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.907724 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.907736 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.907754 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.907764 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:00Z","lastTransitionTime":"2025-12-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.912375 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc kubenswrapper[4935]: I1217 09:06:00.924552 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:00 crc 
kubenswrapper[4935]: I1217 09:06:00.937117 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:00Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.010421 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.010466 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.010478 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.010497 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.010508 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.113711 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.113777 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.113790 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.113813 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.113827 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.123980 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:01 crc kubenswrapper[4935]: E1217 09:06:01.124142 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.136607 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.148138 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4
b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.160882 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:50Z\\\",\\\"message\\\":\\\"2025-12-17T09:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f\\\\n2025-12-17T09:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f to /host/opt/cni/bin/\\\\n2025-12-17T09:05:05Z [verbose] multus-daemon started\\\\n2025-12-17T09:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-17T09:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.174026 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.190053 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.203898 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.215776 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.215825 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.215833 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.215849 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.215859 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.217587 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.233152 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.255605 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.270039 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.285775 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.306770 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed 
attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.318526 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.318571 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.318586 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.318608 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.318622 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.322731 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.338265 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc 
kubenswrapper[4935]: I1217 09:06:01.352460 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.369497 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.389241 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.421375 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.421449 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.421462 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.421481 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.421493 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.525460 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.525572 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.525610 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.525657 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.525674 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.629171 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.629218 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.629228 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.629248 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.629262 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.669209 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/3.log" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.669834 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/2.log" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.673423 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" exitCode=1 Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.673480 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.673533 4935 scope.go:117] "RemoveContainer" containerID="ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.674642 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:06:01 crc kubenswrapper[4935]: E1217 09:06:01.674867 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.691468 4935 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b8
8c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\",\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.703708 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.717049 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.731996 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.732024 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.732032 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.732050 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.732061 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.735218 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ee3749ff2c0637b2748f6967427c9dcd8349565a90bf6a27b054a169acc36520\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:32Z\\\",\\\"message\\\":\\\"r *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc after 0 failed attempt(s)\\\\nI1217 09:05:32.134741 6560 default_network_controller.go:776] Recording success event on pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nI1217 09:05:32.134758 6560 lb_config.go:1031] Cluster endpoints for 
openshift-controller-manager/controller-manager for network=default are: map[]\\\\nI1217 09:05:32.134779 6560 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134788 6560 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI1217 09:05:32.134761 6560 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:05:32.134797 6560 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI1217 09:05:32.134833 6560 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF1217 09:05:32.134881 6560 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:06:01Z\\\",\\\"message\\\":\\\"01.061117 6952 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-bw8z8\\\\nI1217 09:06:01.061422 6952 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1217 09:06:01.061423 6952 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:06:01.061396 6952 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1217 09:06:01.061524 6952 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.54824ms\\\\nI1217 09:06:01.061538 6952 services_controller.go:356] Processing sync for service openshift-dns-operator/metrics for network=default\\\\nI1217 09:06:01.061077 6952 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF1217 09:06:01.061582 6952 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:06:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\
\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 
09:06:01.750539 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29
e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.763966 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.774804 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc 
kubenswrapper[4935]: I1217 09:06:01.790910 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.806236 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.821897 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.833860 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.834844 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.834888 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.834898 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.834917 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.834930 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.845648 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.861461 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4ecb29aa69ed7a4c7546208086eebf593112c86018
a8769f01d335effc55a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:50Z\\\",\\\"message\\\":\\\"2025-12-17T09:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f\\\\n2025-12-17T09:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f to /host/opt/cni/bin/\\\\n2025-12-17T09:05:05Z [verbose] multus-daemon started\\\\n2025-12-17T09:05:05Z [verbose] Readiness Indicator file check\\\\n2025-12-17T09:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.879872 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.893553 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.909917 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.925038 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:01Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.938199 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.938254 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.938267 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.938309 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:01 crc kubenswrapper[4935]: I1217 09:06:01.938322 4935 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:01Z","lastTransitionTime":"2025-12-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.041184 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.041321 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.041349 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.041383 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.041406 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.123246 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.123249 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:02 crc kubenswrapper[4935]: E1217 09:06:02.123507 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.123307 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:02 crc kubenswrapper[4935]: E1217 09:06:02.123561 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:02 crc kubenswrapper[4935]: E1217 09:06:02.123767 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.144661 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.144742 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.144756 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.144951 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.144970 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.247363 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.247427 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.247440 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.247461 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.247476 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.350792 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.350832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.350841 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.350855 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.350865 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.453476 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.453511 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.453544 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.453559 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.453570 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.556069 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.556113 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.556122 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.556143 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.556154 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.659128 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.659177 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.659192 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.659211 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.659226 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.678692 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/3.log" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.683797 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:06:02 crc kubenswrapper[4935]: E1217 09:06:02.684050 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.701522 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.727212 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"969f53bb-09fc-4577-8f7c-dc6ca1679add\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:06:01Z\\\",\\\"message\\\":\\\"01.061117 6952 obj_retry.go:303] Retry object setup: *v1.Pod openshift-dns/node-resolver-bw8z8\\\\nI1217 09:06:01.061422 6952 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}\\\\nI1217 
09:06:01.061423 6952 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI1217 09:06:01.061396 6952 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI1217 09:06:01.061524 6952 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.54824ms\\\\nI1217 09:06:01.061538 6952 services_controller.go:356] Processing sync for service openshift-dns-operator/metrics for network=default\\\\nI1217 09:06:01.061077 6952 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-target-xd92c] creating logical port openshift-network-diagnostics_network-check-target-xd92c for pod on switch crc\\\\nF1217 09:06:01.061582 6952 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin networ\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:06:00Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c411c2faa3a6516b94
a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8ftrx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-rwwd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.745891 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"44bcbaec-1004-4feb-88ca-4fb1aeeb7c73\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff42e52d459529cc24d364deb40b4905a42a9ae17afd7536c793a72f1162ac7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4611a95659ea644e6ad7f3dd36aeb7672845500039d7ea6d0cfdac472bde9b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b04d5aa7c39975999495ef1d8bd1136997efc9c011892743e7b5c29e6558c32\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:06Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7c13d9c8966a8a603834fa1d32286cd19111ddf0e0af4ebe1b98ba389817a78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://77e25
b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://77e25b9fa43e841fe9b9070cf065b00b56ec3d767f41a05f0afbc5c62ba36eec\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dbf76d9ccfcc7ad6135b205ac143d47bd8685ba9f02e6b864e1ce669f844eb04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:10Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://865d58826ef5e3528647a78b90cede9b40285934a5bb80e43574e9b24e5d08ff\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:05:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vf6lc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qzmn2\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.761118 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8a1fb087-2513-44cc-8dfd-e9879b0e840c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebe3ae4ab3b134bf17d4eadf08b598368b32b62d3dedb68ecbb48cb351ed5bef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6283498c111900896b573be092a829bbf95c8c7e501a7aebe4154f30a740b11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftd9l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:16Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sh5rm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.762713 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.762772 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.762792 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.762842 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.762870 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.775764 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"77feddc8-547a-42a0-baa3-19dd2915eb9f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dhr4p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:17Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rg2z5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc 
kubenswrapper[4935]: I1217 09:06:02.790775 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1a317859-ffd5-46d3-8463-aec0e6b70b62\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad351b0d0c797e9b94a699a29a7007d2182f43ab0a8d66a81fc8f274adbb3709\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2ed71a9402e474e3e44482e142c85b45a3d63af4df1184096c63d322d019e97\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51fadc3abee97e9f4a2a63b56da9cfccf00da479eea356973385220f0f2f10a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://89a4dbd9c35f367dc5314c9705ed6bb4fe2989fb4f4a2d5bab430e5348bc6814\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.809741 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59cb7cc0b66a0372e01711bad8d372a91480dd982025f8c3130d830d431b5d2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.824448 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.838188 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-bw8z8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a33bf5c1-c1b1-4f4d-afdf-7b8bd74c8339\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba9de1e9463c10ecde675f2333a3c5a27a94d416051b7fee1e94eab90c886ea2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9lfmh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-bw8z8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.850145 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-n6z48" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c716f0c7-850f-4cc4-bd28-5a2807f126a3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://46a46501887366a907dda713462ff0f1145967310290c9acf4b37e00694d326c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mvmjl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:03Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-n6z48\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.863728 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-jrmtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b52811a-aff2-43c1-9074-f0654f991d9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-17T09:05:50Z\\\",\\\"message\\\":\\\"2025-12-17T09:05:05+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f\\\\n2025-12-17T09:05:05+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_6b697acd-7753-4f8f-8a79-d0acfe502e1f to /host/opt/cni/bin/\\\\n2025-12-17T09:05:05Z [verbose] multus-daemon started\\\\n2025-12-17T09:05:05Z [verbose] 
Readiness Indicator file check\\\\n2025-12-17T09:05:50Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqxq4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod \"openshift-multus\"/\"multus-jrmtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.865372 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.865437 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.865454 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.865494 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.865511 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.878832 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1205f316-e5a6-43e4-a4b1-068b0fce9066\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://31f4a6c26f6eab10aee1fba7e19728f6e10eecb9c12c904c39c751f806e6d8a0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://672bf85e1c2
45fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://704e155ee0b290dc7b18d91aa899a8eb04e6aa85a9c569e82e493c98641ec8ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb7759440a0317891474430033ee00528c75b6402b5ee7d69d88358897b77915\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.894261 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f6913a1bf7c2aa13a97e9277a2417399d1e91d53946f0925d488133fee5705b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.908913 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.923713 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:03Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a81155e384a88d1dfe28a5d61de6014bf1a4ba675e7f5e7fe1f05ed56cbd688b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3149a3db7b18b74d37c8a85c6e87123c04af4e9c61ded5ebd0ee8febcd4ed38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.943140 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"af86d1aa-14d6-4f22-9459-2dfffc50d347\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:04:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-17T09:04:46Z\\\"
,\\\"message\\\":\\\"W1217 09:04:45.275238 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1217 09:04:45.276178 1 crypto.go:601] Generating new CA for check-endpoints-signer@1765962285 cert, and key in /tmp/serving-cert-3351416762/serving-signer.crt, /tmp/serving-cert-3351416762/serving-signer.key\\\\nI1217 09:04:45.807146 1 observer_polling.go:159] Starting file observer\\\\nW1217 09:04:45.810604 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1217 09:04:45.810909 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1217 09:04:45.812266 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3351416762/tls.crt::/tmp/serving-cert-3351416762/tls.key\\\\\\\"\\\\nF1217 09:04:46.444285 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection 
refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-17T09:04:45Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:04:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-17T09:04:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"s
tartedAt\\\":\\\"2025-12-17T09:04:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:04:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.959397 4935 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6d8b2226-e518-487d-967a-78cbfd4da1dc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-17T09:05:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://937885fde500234ca778bb73e26b683d3305b2d3046ae7375ec692353105f0b7\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-17T09:05:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l52zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-17T09:05:04Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-k7lhw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-17T09:06:02Z is after 2025-08-24T17:21:41Z" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.968619 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.968723 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.968738 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.968762 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:02 crc kubenswrapper[4935]: I1217 09:06:02.968777 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:02Z","lastTransitionTime":"2025-12-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.072200 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.072254 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.072267 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.072304 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.072318 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.123473 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:03 crc kubenswrapper[4935]: E1217 09:06:03.123726 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.175291 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.175335 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.175348 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.175369 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.175387 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.278949 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.279029 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.279046 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.279076 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.279099 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.382700 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.382753 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.382765 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.382788 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.382805 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.485836 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.485939 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.485973 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.486007 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.486032 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.590016 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.590058 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.590066 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.590083 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.590094 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.692417 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.693042 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.693061 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.693087 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.693106 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.795040 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.795082 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.795091 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.795106 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.795118 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.897985 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.898041 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.898057 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.898083 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:03 crc kubenswrapper[4935]: I1217 09:06:03.898098 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:03Z","lastTransitionTime":"2025-12-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.000870 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.000918 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.000932 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.000951 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.000965 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.104228 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.104352 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.104371 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.104399 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.104417 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.123260 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.123321 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.123329 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:04 crc kubenswrapper[4935]: E1217 09:06:04.123503 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:04 crc kubenswrapper[4935]: E1217 09:06:04.123651 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:04 crc kubenswrapper[4935]: E1217 09:06:04.123801 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.207171 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.207288 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.207312 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.207334 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.207348 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.310144 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.310636 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.310736 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.310872 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.310961 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.415892 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.415962 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.415984 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.416010 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.416033 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.519461 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.519506 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.519515 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.519531 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.519542 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.622289 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.622326 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.622335 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.622351 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.622360 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.724517 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.724596 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.724606 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.724625 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.724635 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.828495 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.828602 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.828614 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.828636 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.828647 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.932030 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.932073 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.932081 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.932099 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:04 crc kubenswrapper[4935]: I1217 09:06:04.932109 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:04Z","lastTransitionTime":"2025-12-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.034815 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.034879 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.034896 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.034926 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.034946 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.124226 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:05 crc kubenswrapper[4935]: E1217 09:06:05.125044 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.137725 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.137779 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.137795 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.137816 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.137834 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.144145 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.241190 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.241226 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.241237 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.241252 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.241262 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.344297 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.344347 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.344359 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.344378 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.344393 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.448005 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.448083 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.448102 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.448134 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.448156 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.551624 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.551710 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.551724 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.551769 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.551784 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.654964 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.655031 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.655043 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.655068 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.655083 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.758157 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.758212 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.758224 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.758249 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.758262 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.862000 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.862071 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.862098 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.862131 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.862154 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.964618 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.964674 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.964685 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.964708 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.964719 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:05Z","lastTransitionTime":"2025-12-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:05 crc kubenswrapper[4935]: I1217 09:06:05.967711 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:06:05 crc kubenswrapper[4935]: E1217 09:06:05.968053 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-17 09:07:09.968004577 +0000 UTC m=+149.627845350 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068148 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068210 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068222 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068242 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068254 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068717 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068808 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068845 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.068876 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069017 4935 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069144 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069171 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069184 4935 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069187 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069030 4935 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069236 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.069194884 +0000 UTC m=+149.729035727 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069242 4935 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069542 4935 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069442 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.069386358 +0000 UTC m=+149.729227151 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069710 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.069681786 +0000 UTC m=+149.729522609 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.069741 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.069728777 +0000 UTC m=+149.729569570 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.123409 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.123552 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.123580 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.123439 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.123749 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:06 crc kubenswrapper[4935]: E1217 09:06:06.124133 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.171387 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.171420 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.171428 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.171445 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.171456 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.274213 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.274256 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.274265 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.274298 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.274309 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.377365 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.377455 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.377482 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.377516 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.377538 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.480247 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.480303 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.480312 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.480328 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.480338 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.583740 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.583813 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.583832 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.583857 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.583874 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.686571 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.686619 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.686631 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.686647 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.686657 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.789995 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.790042 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.790052 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.790071 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.790087 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.892508 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.892559 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.892572 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.892592 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.892608 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.995141 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.995220 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.995231 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.995246 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:06 crc kubenswrapper[4935]: I1217 09:06:06.995256 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:06Z","lastTransitionTime":"2025-12-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.098964 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.099035 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.099053 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.099080 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.099137 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.124386 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:07 crc kubenswrapper[4935]: E1217 09:06:07.124528 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.202177 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.202227 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.202240 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.202259 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.202298 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.306170 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.306297 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.306321 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.306346 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.306365 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.409683 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.409745 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.409761 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.409785 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.409803 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.514065 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.514131 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.514151 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.514184 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.514205 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.618001 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.618106 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.618126 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.618153 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.618171 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.721291 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.721350 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.721361 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.721384 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.721397 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.825439 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.825520 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.825544 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.825583 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.825609 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.929163 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.929204 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.929213 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.929230 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:07 crc kubenswrapper[4935]: I1217 09:06:07.929240 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:07Z","lastTransitionTime":"2025-12-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.032920 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.032976 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.032992 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.033014 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.033031 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.123386 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.123432 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.123424 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:08 crc kubenswrapper[4935]: E1217 09:06:08.123626 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:08 crc kubenswrapper[4935]: E1217 09:06:08.123711 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:08 crc kubenswrapper[4935]: E1217 09:06:08.123841 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.135683 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.135712 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.135720 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.135734 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.135744 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.237863 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.237903 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.237913 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.237929 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.237938 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.340486 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.340564 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.340583 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.340615 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.340636 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.444561 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.444618 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.444627 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.444645 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.444658 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.547534 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.547583 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.547593 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.547609 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.547623 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.651714 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.651793 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.651812 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.651883 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.651905 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.756148 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.756229 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.756263 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.756309 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.756345 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.859234 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.859299 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.859311 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.859328 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.859338 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.962438 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.963221 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.963259 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.963306 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:08 crc kubenswrapper[4935]: I1217 09:06:08.963321 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:08Z","lastTransitionTime":"2025-12-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.066922 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.067024 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.067040 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.067060 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.067074 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:09Z","lastTransitionTime":"2025-12-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.124192 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:09 crc kubenswrapper[4935]: E1217 09:06:09.125011 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.169932 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.169996 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.170012 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.170036 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.170051 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:09Z","lastTransitionTime":"2025-12-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.273665 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.273762 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.273781 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.273809 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.273830 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:09Z","lastTransitionTime":"2025-12-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.376937 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.377027 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.377053 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.377084 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.377108 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:09Z","lastTransitionTime":"2025-12-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.481043 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.481090 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.481103 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.481125 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.481136 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:09Z","lastTransitionTime":"2025-12-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.584333 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.584393 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.584409 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.584429 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.584442 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:09Z","lastTransitionTime":"2025-12-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.612901 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.612977 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.612996 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.613022 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.613042 4935 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-17T09:06:09Z","lastTransitionTime":"2025-12-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.679574 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn"] Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.680436 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.684119 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.684650 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.686713 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.687480 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.768226 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=4.768194704 podStartE2EDuration="4.768194704s" podCreationTimestamp="2025-12-17 09:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.754329669 +0000 UTC m=+89.414170472" watchObservedRunningTime="2025-12-17 09:06:09.768194704 +0000 UTC m=+89.428035477" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.783301 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.78325514 podStartE2EDuration="38.78325514s" podCreationTimestamp="2025-12-17 09:05:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.768497022 +0000 UTC m=+89.428337805" watchObservedRunningTime="2025-12-17 09:06:09.78325514 +0000 UTC m=+89.443095913" Dec 17 09:06:09 crc 
kubenswrapper[4935]: I1217 09:06:09.805514 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qzmn2" podStartSLOduration=66.80549102 podStartE2EDuration="1m6.80549102s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.804953557 +0000 UTC m=+89.464794330" watchObservedRunningTime="2025-12-17 09:06:09.80549102 +0000 UTC m=+89.465331783" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.813681 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1305e728-66b1-4add-abf6-b01e9c17b61a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.813756 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1305e728-66b1-4add-abf6-b01e9c17b61a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.813782 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1305e728-66b1-4add-abf6-b01e9c17b61a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.813801 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1305e728-66b1-4add-abf6-b01e9c17b61a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.813986 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1305e728-66b1-4add-abf6-b01e9c17b61a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.853681 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sh5rm" podStartSLOduration=65.853655477 podStartE2EDuration="1m5.853655477s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.828572693 +0000 UTC m=+89.488413456" watchObservedRunningTime="2025-12-17 09:06:09.853655477 +0000 UTC m=+89.513496250" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.875510 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=65.875488226 podStartE2EDuration="1m5.875488226s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.874984634 +0000 UTC m=+89.534825417" watchObservedRunningTime="2025-12-17 09:06:09.875488226 +0000 UTC m=+89.535328989" 
Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.914597 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1305e728-66b1-4add-abf6-b01e9c17b61a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.914640 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1305e728-66b1-4add-abf6-b01e9c17b61a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.914656 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1305e728-66b1-4add-abf6-b01e9c17b61a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.914707 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1305e728-66b1-4add-abf6-b01e9c17b61a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.914737 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1305e728-66b1-4add-abf6-b01e9c17b61a-kube-api-access\") pod 
\"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.914787 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/1305e728-66b1-4add-abf6-b01e9c17b61a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.914855 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/1305e728-66b1-4add-abf6-b01e9c17b61a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.915831 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1305e728-66b1-4add-abf6-b01e9c17b61a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.921687 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1305e728-66b1-4add-abf6-b01e9c17b61a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.930958 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1305e728-66b1-4add-abf6-b01e9c17b61a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d57fn\" (UID: \"1305e728-66b1-4add-abf6-b01e9c17b61a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.973876 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bw8z8" podStartSLOduration=66.97385562 podStartE2EDuration="1m6.97385562s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.972746302 +0000 UTC m=+89.632587065" watchObservedRunningTime="2025-12-17 09:06:09.97385562 +0000 UTC m=+89.633696383" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.982826 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n6z48" podStartSLOduration=66.98279636 podStartE2EDuration="1m6.98279636s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.982371889 +0000 UTC m=+89.642212662" watchObservedRunningTime="2025-12-17 09:06:09.98279636 +0000 UTC m=+89.642637133" Dec 17 09:06:09 crc kubenswrapper[4935]: I1217 09:06:09.998984 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn"
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.000149 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jrmtf" podStartSLOduration=67.000110583 podStartE2EDuration="1m7.000110583s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:09.999253622 +0000 UTC m=+89.659094395" watchObservedRunningTime="2025-12-17 09:06:10.000110583 +0000 UTC m=+89.659951387"
Dec 17 09:06:10 crc kubenswrapper[4935]: W1217 09:06:10.016691 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1305e728_66b1_4add_abf6_b01e9c17b61a.slice/crio-53e9dd90c6a8d2aed1e5585f982a3df2fee6d7560f96d2a348285993bae4ce11 WatchSource:0}: Error finding container 53e9dd90c6a8d2aed1e5585f982a3df2fee6d7560f96d2a348285993bae4ce11: Status 404 returned error can't find the container with id 53e9dd90c6a8d2aed1e5585f982a3df2fee6d7560f96d2a348285993bae4ce11
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.021108 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=69.021079742 podStartE2EDuration="1m9.021079742s" podCreationTimestamp="2025-12-17 09:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:10.018643339 +0000 UTC m=+89.678484102" watchObservedRunningTime="2025-12-17 09:06:10.021079742 +0000 UTC m=+89.680920505"
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.037716 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podStartSLOduration=67.037682548 podStartE2EDuration="1m7.037682548s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:10.03696954 +0000 UTC m=+89.696810313" watchObservedRunningTime="2025-12-17 09:06:10.037682548 +0000 UTC m=+89.697523321"
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.123559 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.123596 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:10 crc kubenswrapper[4935]: E1217 09:06:10.123717 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.123787 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:10 crc kubenswrapper[4935]: E1217 09:06:10.123860 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:10 crc kubenswrapper[4935]: E1217 09:06:10.123928 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.719670 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" event={"ID":"1305e728-66b1-4add-abf6-b01e9c17b61a","Type":"ContainerStarted","Data":"08315ff3acf37b84047a90bd49bc5742554dd9575b5c36a9fc6624ca07dc94b6"}
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.719767 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" event={"ID":"1305e728-66b1-4add-abf6-b01e9c17b61a","Type":"ContainerStarted","Data":"53e9dd90c6a8d2aed1e5585f982a3df2fee6d7560f96d2a348285993bae4ce11"}
Dec 17 09:06:10 crc kubenswrapper[4935]: I1217 09:06:10.738751 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d57fn" podStartSLOduration=67.738719163 podStartE2EDuration="1m7.738719163s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:10.738442396 +0000 UTC m=+90.398283169" watchObservedRunningTime="2025-12-17 09:06:10.738719163 +0000 UTC m=+90.398559946"
Dec 17 09:06:11 crc kubenswrapper[4935]: I1217 09:06:11.123644 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:11 crc kubenswrapper[4935]: E1217 09:06:11.124937 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:12 crc kubenswrapper[4935]: I1217 09:06:12.123306 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:12 crc kubenswrapper[4935]: I1217 09:06:12.123396 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:12 crc kubenswrapper[4935]: I1217 09:06:12.123432 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:12 crc kubenswrapper[4935]: E1217 09:06:12.123710 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:12 crc kubenswrapper[4935]: E1217 09:06:12.123802 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:12 crc kubenswrapper[4935]: E1217 09:06:12.123602 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:13 crc kubenswrapper[4935]: I1217 09:06:13.124026 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:13 crc kubenswrapper[4935]: E1217 09:06:13.124233 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:14 crc kubenswrapper[4935]: I1217 09:06:14.124120 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:14 crc kubenswrapper[4935]: I1217 09:06:14.124188 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:14 crc kubenswrapper[4935]: I1217 09:06:14.124216 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:14 crc kubenswrapper[4935]: E1217 09:06:14.124415 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:14 crc kubenswrapper[4935]: E1217 09:06:14.124506 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:14 crc kubenswrapper[4935]: E1217 09:06:14.124580 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:14 crc kubenswrapper[4935]: I1217 09:06:14.137706 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Dec 17 09:06:15 crc kubenswrapper[4935]: I1217 09:06:15.123923 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:15 crc kubenswrapper[4935]: E1217 09:06:15.124237 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:16 crc kubenswrapper[4935]: I1217 09:06:16.123844 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:16 crc kubenswrapper[4935]: I1217 09:06:16.123935 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:16 crc kubenswrapper[4935]: I1217 09:06:16.124136 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:16 crc kubenswrapper[4935]: E1217 09:06:16.124249 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:16 crc kubenswrapper[4935]: E1217 09:06:16.124449 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:16 crc kubenswrapper[4935]: E1217 09:06:16.124646 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:17 crc kubenswrapper[4935]: I1217 09:06:17.123824 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:17 crc kubenswrapper[4935]: E1217 09:06:17.124472 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:18 crc kubenswrapper[4935]: I1217 09:06:18.123434 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:18 crc kubenswrapper[4935]: I1217 09:06:18.123474 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:18 crc kubenswrapper[4935]: E1217 09:06:18.123606 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:18 crc kubenswrapper[4935]: I1217 09:06:18.124189 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:18 crc kubenswrapper[4935]: E1217 09:06:18.124266 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:18 crc kubenswrapper[4935]: E1217 09:06:18.124434 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:18 crc kubenswrapper[4935]: I1217 09:06:18.125119 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"
Dec 17 09:06:18 crc kubenswrapper[4935]: E1217 09:06:18.125528 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add"
Dec 17 09:06:19 crc kubenswrapper[4935]: I1217 09:06:19.123622 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:19 crc kubenswrapper[4935]: E1217 09:06:19.125526 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:20 crc kubenswrapper[4935]: I1217 09:06:20.123549 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:20 crc kubenswrapper[4935]: I1217 09:06:20.123649 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:20 crc kubenswrapper[4935]: I1217 09:06:20.123551 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:20 crc kubenswrapper[4935]: E1217 09:06:20.123769 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:20 crc kubenswrapper[4935]: E1217 09:06:20.124011 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:20 crc kubenswrapper[4935]: E1217 09:06:20.124050 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:21 crc kubenswrapper[4935]: I1217 09:06:21.125473 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:21 crc kubenswrapper[4935]: E1217 09:06:21.126211 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:21 crc kubenswrapper[4935]: I1217 09:06:21.145944 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.145923185 podStartE2EDuration="7.145923185s" podCreationTimestamp="2025-12-17 09:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:21.143977075 +0000 UTC m=+100.803817868" watchObservedRunningTime="2025-12-17 09:06:21.145923185 +0000 UTC m=+100.805763948"
Dec 17 09:06:21 crc kubenswrapper[4935]: I1217 09:06:21.663872 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:21 crc kubenswrapper[4935]: E1217 09:06:21.664092 4935 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 17 09:06:21 crc kubenswrapper[4935]: E1217 09:06:21.664230 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs podName:77feddc8-547a-42a0-baa3-19dd2915eb9f nodeName:}" failed. No retries permitted until 2025-12-17 09:07:25.664198291 +0000 UTC m=+165.324039064 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs") pod "network-metrics-daemon-rg2z5" (UID: "77feddc8-547a-42a0-baa3-19dd2915eb9f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 17 09:06:22 crc kubenswrapper[4935]: I1217 09:06:22.123165 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:22 crc kubenswrapper[4935]: I1217 09:06:22.123165 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:22 crc kubenswrapper[4935]: I1217 09:06:22.123180 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:22 crc kubenswrapper[4935]: E1217 09:06:22.123517 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:22 crc kubenswrapper[4935]: E1217 09:06:22.123612 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:22 crc kubenswrapper[4935]: E1217 09:06:22.123837 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:23 crc kubenswrapper[4935]: I1217 09:06:23.124030 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:23 crc kubenswrapper[4935]: E1217 09:06:23.124294 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:24 crc kubenswrapper[4935]: I1217 09:06:24.123858 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:24 crc kubenswrapper[4935]: I1217 09:06:24.123888 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:24 crc kubenswrapper[4935]: I1217 09:06:24.124018 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:24 crc kubenswrapper[4935]: E1217 09:06:24.124077 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:24 crc kubenswrapper[4935]: E1217 09:06:24.124266 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:24 crc kubenswrapper[4935]: E1217 09:06:24.124541 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:25 crc kubenswrapper[4935]: I1217 09:06:25.123912 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:25 crc kubenswrapper[4935]: E1217 09:06:25.124651 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:26 crc kubenswrapper[4935]: I1217 09:06:26.123536 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:26 crc kubenswrapper[4935]: I1217 09:06:26.123635 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:26 crc kubenswrapper[4935]: I1217 09:06:26.123552 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:26 crc kubenswrapper[4935]: E1217 09:06:26.123920 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:26 crc kubenswrapper[4935]: E1217 09:06:26.124085 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:26 crc kubenswrapper[4935]: E1217 09:06:26.123863 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:27 crc kubenswrapper[4935]: I1217 09:06:27.123772 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:27 crc kubenswrapper[4935]: E1217 09:06:27.124016 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:28 crc kubenswrapper[4935]: I1217 09:06:28.123772 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:28 crc kubenswrapper[4935]: I1217 09:06:28.123792 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:28 crc kubenswrapper[4935]: I1217 09:06:28.123881 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:28 crc kubenswrapper[4935]: E1217 09:06:28.124245 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:28 crc kubenswrapper[4935]: E1217 09:06:28.124397 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:28 crc kubenswrapper[4935]: E1217 09:06:28.124524 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:29 crc kubenswrapper[4935]: I1217 09:06:29.123728 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:29 crc kubenswrapper[4935]: E1217 09:06:29.123968 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:30 crc kubenswrapper[4935]: I1217 09:06:30.123070 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:30 crc kubenswrapper[4935]: E1217 09:06:30.123198 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:30 crc kubenswrapper[4935]: I1217 09:06:30.123062 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:30 crc kubenswrapper[4935]: I1217 09:06:30.123382 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:30 crc kubenswrapper[4935]: E1217 09:06:30.123568 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:30 crc kubenswrapper[4935]: E1217 09:06:30.123738 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:30 crc kubenswrapper[4935]: I1217 09:06:30.125138 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"
Dec 17 09:06:30 crc kubenswrapper[4935]: E1217 09:06:30.125509 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-rwwd4_openshift-ovn-kubernetes(969f53bb-09fc-4577-8f7c-dc6ca1679add)\"" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add"
Dec 17 09:06:31 crc kubenswrapper[4935]: I1217 09:06:31.123209 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:31 crc kubenswrapper[4935]: E1217 09:06:31.125682 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:32 crc kubenswrapper[4935]: I1217 09:06:32.124043 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 17 09:06:32 crc kubenswrapper[4935]: I1217 09:06:32.124196 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:32 crc kubenswrapper[4935]: E1217 09:06:32.124346 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 17 09:06:32 crc kubenswrapper[4935]: I1217 09:06:32.124043 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:32 crc kubenswrapper[4935]: E1217 09:06:32.124504 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 17 09:06:32 crc kubenswrapper[4935]: E1217 09:06:32.124663 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 17 09:06:33 crc kubenswrapper[4935]: I1217 09:06:33.123741 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5"
Dec 17 09:06:33 crc kubenswrapper[4935]: E1217 09:06:33.123877 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f"
Dec 17 09:06:34 crc kubenswrapper[4935]: I1217 09:06:34.123764 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 17 09:06:34 crc kubenswrapper[4935]: I1217 09:06:34.123860 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 17 09:06:34 crc kubenswrapper[4935]: I1217 09:06:34.123782 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:34 crc kubenswrapper[4935]: E1217 09:06:34.124133 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:34 crc kubenswrapper[4935]: E1217 09:06:34.124310 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:34 crc kubenswrapper[4935]: E1217 09:06:34.124539 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:35 crc kubenswrapper[4935]: I1217 09:06:35.123624 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:35 crc kubenswrapper[4935]: E1217 09:06:35.123884 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:36 crc kubenswrapper[4935]: I1217 09:06:36.123654 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:36 crc kubenswrapper[4935]: I1217 09:06:36.123654 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:36 crc kubenswrapper[4935]: I1217 09:06:36.123687 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:36 crc kubenswrapper[4935]: E1217 09:06:36.124439 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:36 crc kubenswrapper[4935]: E1217 09:06:36.124631 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:36 crc kubenswrapper[4935]: E1217 09:06:36.125138 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:37 crc kubenswrapper[4935]: I1217 09:06:37.123213 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:37 crc kubenswrapper[4935]: E1217 09:06:37.123799 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:37 crc kubenswrapper[4935]: I1217 09:06:37.824485 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/1.log" Dec 17 09:06:37 crc kubenswrapper[4935]: I1217 09:06:37.824953 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/0.log" Dec 17 09:06:37 crc kubenswrapper[4935]: I1217 09:06:37.824997 4935 generic.go:334] "Generic (PLEG): container finished" podID="8b52811a-aff2-43c1-9074-f0654f991d9c" containerID="f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a" exitCode=1 Dec 17 09:06:37 crc kubenswrapper[4935]: I1217 09:06:37.825033 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerDied","Data":"f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a"} Dec 17 09:06:37 crc kubenswrapper[4935]: I1217 09:06:37.825075 4935 scope.go:117] "RemoveContainer" containerID="f8e44ab36fdf436cbb6413ccaeef050b934ecb8936f4b22e5b57ef50fcecd944" Dec 17 09:06:37 crc kubenswrapper[4935]: I1217 09:06:37.826411 4935 scope.go:117] "RemoveContainer" containerID="f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a" Dec 17 09:06:37 crc kubenswrapper[4935]: E1217 09:06:37.826868 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-jrmtf_openshift-multus(8b52811a-aff2-43c1-9074-f0654f991d9c)\"" pod="openshift-multus/multus-jrmtf" podUID="8b52811a-aff2-43c1-9074-f0654f991d9c" Dec 17 09:06:38 crc kubenswrapper[4935]: I1217 09:06:38.123446 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:38 crc kubenswrapper[4935]: I1217 09:06:38.123476 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:38 crc kubenswrapper[4935]: I1217 09:06:38.123497 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:38 crc kubenswrapper[4935]: E1217 09:06:38.123663 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:38 crc kubenswrapper[4935]: E1217 09:06:38.123804 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:38 crc kubenswrapper[4935]: E1217 09:06:38.123924 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:38 crc kubenswrapper[4935]: I1217 09:06:38.831904 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/1.log" Dec 17 09:06:39 crc kubenswrapper[4935]: I1217 09:06:39.123202 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:39 crc kubenswrapper[4935]: E1217 09:06:39.123473 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:40 crc kubenswrapper[4935]: I1217 09:06:40.123905 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:40 crc kubenswrapper[4935]: I1217 09:06:40.123944 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:40 crc kubenswrapper[4935]: I1217 09:06:40.124003 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:40 crc kubenswrapper[4935]: E1217 09:06:40.124094 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:40 crc kubenswrapper[4935]: E1217 09:06:40.124200 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:40 crc kubenswrapper[4935]: E1217 09:06:40.124378 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:41 crc kubenswrapper[4935]: E1217 09:06:41.098505 4935 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 17 09:06:41 crc kubenswrapper[4935]: I1217 09:06:41.123812 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:41 crc kubenswrapper[4935]: E1217 09:06:41.125088 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:41 crc kubenswrapper[4935]: E1217 09:06:41.253483 4935 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 17 09:06:42 crc kubenswrapper[4935]: I1217 09:06:42.123457 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:42 crc kubenswrapper[4935]: E1217 09:06:42.123990 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:42 crc kubenswrapper[4935]: I1217 09:06:42.123509 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:42 crc kubenswrapper[4935]: I1217 09:06:42.123488 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:42 crc kubenswrapper[4935]: E1217 09:06:42.124488 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:42 crc kubenswrapper[4935]: E1217 09:06:42.124762 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:43 crc kubenswrapper[4935]: I1217 09:06:43.123628 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:43 crc kubenswrapper[4935]: E1217 09:06:43.123925 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.123671 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:44 crc kubenswrapper[4935]: E1217 09:06:44.123872 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.123986 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.124018 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:44 crc kubenswrapper[4935]: E1217 09:06:44.124467 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:44 crc kubenswrapper[4935]: E1217 09:06:44.124638 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.125035 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.858970 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/3.log" Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.861757 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerStarted","Data":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.862250 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:06:44 crc kubenswrapper[4935]: I1217 09:06:44.889716 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podStartSLOduration=101.889688211 podStartE2EDuration="1m41.889688211s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:06:44.88770269 +0000 UTC m=+124.547543463" watchObservedRunningTime="2025-12-17 09:06:44.889688211 +0000 UTC m=+124.549528974" Dec 17 09:06:45 crc kubenswrapper[4935]: I1217 09:06:45.041028 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rg2z5"] Dec 17 09:06:45 crc kubenswrapper[4935]: I1217 09:06:45.041138 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:45 crc kubenswrapper[4935]: E1217 09:06:45.041232 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:46 crc kubenswrapper[4935]: I1217 09:06:46.123615 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:46 crc kubenswrapper[4935]: I1217 09:06:46.123615 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:46 crc kubenswrapper[4935]: E1217 09:06:46.123830 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:46 crc kubenswrapper[4935]: E1217 09:06:46.123866 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:46 crc kubenswrapper[4935]: I1217 09:06:46.123637 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:46 crc kubenswrapper[4935]: E1217 09:06:46.123952 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:46 crc kubenswrapper[4935]: E1217 09:06:46.255052 4935 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 17 09:06:47 crc kubenswrapper[4935]: I1217 09:06:47.123689 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:47 crc kubenswrapper[4935]: E1217 09:06:47.123943 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:48 crc kubenswrapper[4935]: I1217 09:06:48.123709 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:48 crc kubenswrapper[4935]: I1217 09:06:48.123760 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:48 crc kubenswrapper[4935]: I1217 09:06:48.123830 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:48 crc kubenswrapper[4935]: E1217 09:06:48.123848 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:48 crc kubenswrapper[4935]: E1217 09:06:48.123942 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:48 crc kubenswrapper[4935]: E1217 09:06:48.124019 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:49 crc kubenswrapper[4935]: I1217 09:06:49.124186 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:49 crc kubenswrapper[4935]: E1217 09:06:49.124423 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:50 crc kubenswrapper[4935]: I1217 09:06:50.123653 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:50 crc kubenswrapper[4935]: I1217 09:06:50.123806 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:50 crc kubenswrapper[4935]: I1217 09:06:50.123937 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:50 crc kubenswrapper[4935]: E1217 09:06:50.124088 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:50 crc kubenswrapper[4935]: E1217 09:06:50.123948 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:50 crc kubenswrapper[4935]: E1217 09:06:50.124358 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:51 crc kubenswrapper[4935]: I1217 09:06:51.124221 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:51 crc kubenswrapper[4935]: E1217 09:06:51.126563 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:51 crc kubenswrapper[4935]: I1217 09:06:51.127374 4935 scope.go:117] "RemoveContainer" containerID="f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a" Dec 17 09:06:51 crc kubenswrapper[4935]: E1217 09:06:51.256001 4935 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 17 09:06:51 crc kubenswrapper[4935]: I1217 09:06:51.894614 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/1.log" Dec 17 09:06:51 crc kubenswrapper[4935]: I1217 09:06:51.894678 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerStarted","Data":"11369d6fada4674292fe86adc7a89a8519d4860f1afdfbc6ee9bb6e4a1e3d22e"} Dec 17 09:06:52 crc kubenswrapper[4935]: I1217 09:06:52.124107 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:52 crc kubenswrapper[4935]: I1217 09:06:52.124106 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:52 crc kubenswrapper[4935]: I1217 09:06:52.124221 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:52 crc kubenswrapper[4935]: E1217 09:06:52.124658 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:52 crc kubenswrapper[4935]: E1217 09:06:52.124706 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:52 crc kubenswrapper[4935]: E1217 09:06:52.124516 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:53 crc kubenswrapper[4935]: I1217 09:06:53.123863 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:53 crc kubenswrapper[4935]: E1217 09:06:53.124050 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:54 crc kubenswrapper[4935]: I1217 09:06:54.123388 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:54 crc kubenswrapper[4935]: E1217 09:06:54.123512 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:54 crc kubenswrapper[4935]: I1217 09:06:54.123388 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:54 crc kubenswrapper[4935]: I1217 09:06:54.123607 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:54 crc kubenswrapper[4935]: E1217 09:06:54.123716 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:54 crc kubenswrapper[4935]: E1217 09:06:54.123764 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:55 crc kubenswrapper[4935]: I1217 09:06:55.123777 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:55 crc kubenswrapper[4935]: E1217 09:06:55.123980 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rg2z5" podUID="77feddc8-547a-42a0-baa3-19dd2915eb9f" Dec 17 09:06:56 crc kubenswrapper[4935]: I1217 09:06:56.123933 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:56 crc kubenswrapper[4935]: I1217 09:06:56.124078 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:56 crc kubenswrapper[4935]: I1217 09:06:56.124170 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:56 crc kubenswrapper[4935]: E1217 09:06:56.124168 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 17 09:06:56 crc kubenswrapper[4935]: E1217 09:06:56.124187 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 17 09:06:56 crc kubenswrapper[4935]: E1217 09:06:56.124297 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 17 09:06:57 crc kubenswrapper[4935]: I1217 09:06:57.123865 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:06:57 crc kubenswrapper[4935]: I1217 09:06:57.127083 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 17 09:06:57 crc kubenswrapper[4935]: I1217 09:06:57.127497 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 17 09:06:58 crc kubenswrapper[4935]: I1217 09:06:58.123816 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:06:58 crc kubenswrapper[4935]: I1217 09:06:58.123816 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:06:58 crc kubenswrapper[4935]: I1217 09:06:58.123834 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:06:58 crc kubenswrapper[4935]: I1217 09:06:58.127474 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 17 09:06:58 crc kubenswrapper[4935]: I1217 09:06:58.127492 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 17 09:06:58 crc kubenswrapper[4935]: I1217 09:06:58.127526 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 17 09:06:58 crc kubenswrapper[4935]: I1217 09:06:58.128417 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 17 09:07:00 crc kubenswrapper[4935]: I1217 09:07:00.889915 4935 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 17 09:07:00 crc 
kubenswrapper[4935]: I1217 09:07:00.929386 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mpgxh"] Dec 17 09:07:00 crc kubenswrapper[4935]: I1217 09:07:00.929983 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.089438 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.089485 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.089451 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.089667 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.089720 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.089837 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090341 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090352 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090476 4935 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090724 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090781 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090810 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090840 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090867 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090941 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.090971 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091012 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-dir\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091036 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091063 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091092 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091106 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091121 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091246 4935 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091257 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w82mt\" (UniqueName: \"kubernetes.io/projected/e6bcddbf-eb05-4170-87db-6021b9da7df0-kube-api-access-w82mt\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091347 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-policies\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.091817 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.092015 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k2b7h"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.092571 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.095363 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.097386 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.097791 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.098160 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b2nl7"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.098806 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.100228 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.100668 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.104362 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.108885 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-29mfs"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.109293 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-29mfs" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.110323 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.115336 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hq8r6"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.115962 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-nw6k6"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.116410 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.117170 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v725p"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.119180 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.118265 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.125481 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.125715 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.125482 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.126467 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.141038 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.142243 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.143260 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.152385 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.152615 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.152717 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.152901 4935 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.153103 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.153225 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.153426 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.153696 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.174800 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.175114 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.175118 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.175320 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.175644 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.175765 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.176303 4935 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.175241 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.177078 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.177330 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.177574 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.177743 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.178034 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.178134 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.178222 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.178416 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.178615 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 17 09:07:01 crc 
kubenswrapper[4935]: I1217 09:07:01.178701 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.178850 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.179504 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.179635 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.180100 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.180345 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.182616 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.185782 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.188660 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.189450 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckj77"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.189562 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.189945 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.190372 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l9zg8"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.190596 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.190837 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191115 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191155 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191509 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191882 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-policies\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191914 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191956 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191977 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.191998 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192023 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192084 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192108 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192136 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-dir\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192160 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192178 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192202 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192230 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.192256 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w82mt\" 
(UniqueName: \"kubernetes.io/projected/e6bcddbf-eb05-4170-87db-6021b9da7df0-kube-api-access-w82mt\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.193040 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.193770 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.194445 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.194808 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.194985 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.195207 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-policies\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.195577 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.196653 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.196994 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.197161 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.197360 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.197598 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.197630 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.198531 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.198769 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.199436 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.199609 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.199871 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.200172 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.200377 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.200499 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.200923 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.201046 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.201154 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.201299 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.201781 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-dir\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.202213 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.202334 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.209772 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.204802 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205888 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.206952 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.207011 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.204074 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.204181 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.204330 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.204571 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.204649 4935 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205140 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205201 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205322 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205327 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205382 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205460 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205527 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205568 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.205571 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.211908 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.212102 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.212454 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.212474 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.212589 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.216902 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.212647 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.213092 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.213141 4935 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console-operator"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.213192 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.213253 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.213519 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.213782 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.220173 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.220744 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.221239 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.222009 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.222320 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mpgxh"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.220811 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.222875 4935 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.224590 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.227754 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hq8r6"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.228351 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w82mt\" (UniqueName: \"kubernetes.io/projected/e6bcddbf-eb05-4170-87db-6021b9da7df0-kube-api-access-w82mt\") pod \"oauth-openshift-558db77b4-mpgxh\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.230733 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh"] Dec 17 09:07:01 crc kubenswrapper[4935]: I1217 09:07:01.232723 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k2b7h"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261094 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261192 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fj52\" (UniqueName: \"kubernetes.io/projected/9c9294b4-aa19-4670-b826-77b9641ea149-kube-api-access-7fj52\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261236 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261295 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gczp\" (UniqueName: \"kubernetes.io/projected/771b6836-1fda-49f0-b7aa-2c65f7e81dad-kube-api-access-5gczp\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261337 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-encryption-config\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261355 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eba7d82-1203-49c2-bc66-7c77d2298a0d-audit-dir\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261375 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261400 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-service-ca-bundle\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261420 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-trusted-ca-bundle\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261447 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwbgc\" (UniqueName: \"kubernetes.io/projected/f337d441-0527-46d0-98f4-a9323a682482-kube-api-access-mwbgc\") pod \"controller-manager-879f6c89f-v725p\" (UID: 
\"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261474 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djhp8\" (UniqueName: \"kubernetes.io/projected/b44f7e9d-3f08-48de-a465-056daf8a4549-kube-api-access-djhp8\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261501 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-oauth-serving-cert\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261538 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d5ccb-2d22-453f-b6d0-8eca00275efb-serving-cert\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261554 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssq5c\" (UniqueName: \"kubernetes.io/projected/6a47edc9-43df-40ba-8b92-80205500df3c-kube-api-access-ssq5c\") pod \"dns-operator-744455d44c-hq8r6\" (UID: \"6a47edc9-43df-40ba-8b92-80205500df3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261571 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/acc8184c-2eca-45cf-acc1-53501bde4e00-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261586 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b44f7e9d-3f08-48de-a465-056daf8a4549-audit-dir\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261603 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4k5\" (UniqueName: \"kubernetes.io/projected/618adccc-479a-43b8-a44f-eb62ce26108a-kube-api-access-6t4k5\") pod \"downloads-7954f5f757-29mfs\" (UID: \"618adccc-479a-43b8-a44f-eb62ce26108a\") " pod="openshift-console/downloads-7954f5f757-29mfs" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261620 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/26bdd534-df87-4879-b036-377d8c606d5c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261635 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261651 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.261678 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-service-ca\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.265787 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l9zg8"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.267443 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.267608 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc8184c-2eca-45cf-acc1-53501bde4e00-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.267655 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4526\" (UniqueName: 
\"kubernetes.io/projected/acc8184c-2eca-45cf-acc1-53501bde4e00-kube-api-access-z4526\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.267723 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.267757 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-serving-cert\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.267851 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26bdd534-df87-4879-b036-377d8c606d5c-config\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.267913 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-29mfs"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.268210 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65m52\" (UniqueName: \"kubernetes.io/projected/26bdd534-df87-4879-b036-377d8c606d5c-kube-api-access-65m52\") pod 
\"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.268385 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-config\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.268524 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-etcd-client\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.269377 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-encryption-config\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.269431 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-serving-cert\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270219 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47twc\" 
(UniqueName: \"kubernetes.io/projected/0eba7d82-1203-49c2-bc66-7c77d2298a0d-kube-api-access-47twc\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270426 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdgq\" (UniqueName: \"kubernetes.io/projected/4334562a-5ab5-4601-932e-4a765f2d11a8-kube-api-access-6wdgq\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270486 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acc8184c-2eca-45cf-acc1-53501bde4e00-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270549 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c9294b4-aa19-4670-b826-77b9641ea149-machine-approver-tls\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270607 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-oauth-config\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " 
pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270674 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-config\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270727 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b44f7e9d-3f08-48de-a465-056daf8a4549-node-pullsecrets\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270791 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270814 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771b6836-1fda-49f0-b7aa-2c65f7e81dad-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270851 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bc5v9\" (UID: \"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270935 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97rjm\" (UniqueName: \"kubernetes.io/projected/cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b-kube-api-access-97rjm\") pod \"cluster-samples-operator-665b6dd947-bc5v9\" (UID: \"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.270990 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02137895-2767-439d-af36-1b95bb61aaeb-trusted-ca\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271055 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-audit-policies\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271083 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4334562a-5ab5-4601-932e-4a765f2d11a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271113 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-console-config\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271147 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9294b4-aa19-4670-b826-77b9641ea149-config\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271168 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02137895-2767-439d-af36-1b95bb61aaeb-serving-cert\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271184 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-audit\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271204 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771b6836-1fda-49f0-b7aa-2c65f7e81dad-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271222 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-etcd-client\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271243 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-serving-cert\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.271261 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c9294b4-aa19-4670-b826-77b9641ea149-auth-proxy-config\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272453 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-config\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272511 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4334562a-5ab5-4601-932e-4a765f2d11a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272545 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a47edc9-43df-40ba-8b92-80205500df3c-metrics-tls\") pod \"dns-operator-744455d44c-hq8r6\" (UID: \"6a47edc9-43df-40ba-8b92-80205500df3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272643 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mq8w\" (UniqueName: \"kubernetes.io/projected/02137895-2767-439d-af36-1b95bb61aaeb-kube-api-access-6mq8w\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272692 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-client-ca\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272177 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272773 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-image-import-ca\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.272925 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26bdd534-df87-4879-b036-377d8c606d5c-images\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.273082 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b647c\" (UniqueName: \"kubernetes.io/projected/722d5ccb-2d22-453f-b6d0-8eca00275efb-kube-api-access-b647c\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.273173 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v725p"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.273776 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02137895-2767-439d-af36-1b95bb61aaeb-config\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.273834 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f337d441-0527-46d0-98f4-a9323a682482-serving-cert\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.273874 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpsw\" (UniqueName: \"kubernetes.io/projected/a074b884-bf31-47dc-9257-41a7d4dda13e-kube-api-access-ljpsw\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.275399 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.277183 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nw6k6"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.278557 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.279546 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b2nl7"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.281504 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckj77"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.282459 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.284507 4935 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.285937 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.285981 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gx689"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.287196 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.287739 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mqg4s"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.288825 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.291831 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.292340 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.294908 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.296089 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.296365 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g8v79"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.296563 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.296881 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.297041 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.297229 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.297435 4935 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.297676 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.297866 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.297916 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.298983 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.299223 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.299471 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.299571 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.301120 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.301498 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gtvxm"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.301873 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.302247 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.303388 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.306016 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.306047 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.306286 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.306454 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.307039 4935 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.307393 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.307804 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.307954 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.308053 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.308110 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.308260 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.308367 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.308118 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.308825 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.307896 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7"] Dec 17 09:07:02 crc 
kubenswrapper[4935]: I1217 09:07:02.309250 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.309265 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.309351 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.309406 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.309447 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.310933 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.311165 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.311894 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.312037 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.312718 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.312977 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.313133 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.313300 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.313474 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.313511 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.313646 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.314844 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.315932 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.321750 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jwnkk"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.324862 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.326038 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.326407 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.326580 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.326643 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.327121 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.327195 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.328019 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.328320 4935 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.328390 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.328625 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.329114 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.329816 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.328638 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.328649 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.332559 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.333157 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.336633 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-79ntp"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.341182 4935 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.341812 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.344478 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.347161 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-nmgrc"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.347844 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.349636 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.350557 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bvl78"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.351250 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.352734 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.353361 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.353431 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.353559 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.353686 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.354135 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.355133 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4xq"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.355601 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.356852 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.357807 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.358371 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.358866 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.359720 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tdxp5"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.360413 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.361205 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-84n6g"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.362206 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.364444 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mqg4s"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.366107 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.367616 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nmgrc"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.369299 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.371517 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.373326 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 17 09:07:02 
crc kubenswrapper[4935]: I1217 09:07:02.373671 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.374603 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.374848 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4334562a-5ab5-4601-932e-4a765f2d11a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.374900 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-console-config\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.374930 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9294b4-aa19-4670-b826-77b9641ea149-config\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375011 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02137895-2767-439d-af36-1b95bb61aaeb-serving-cert\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " 
pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375039 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-audit\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375077 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771b6836-1fda-49f0-b7aa-2c65f7e81dad-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375116 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8591f8-8280-436e-a261-35cf66909b26-config\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375145 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9r6j\" (UniqueName: \"kubernetes.io/projected/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-kube-api-access-t9r6j\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375175 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p2fj\" (UniqueName: 
\"kubernetes.io/projected/336946bd-b29f-44dd-a54e-a9a023684726-kube-api-access-4p2fj\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375214 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375238 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-config\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375263 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-etcd-client\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375311 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-serving-cert\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375338 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9c9294b4-aa19-4670-b826-77b9641ea149-auth-proxy-config\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375364 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-config\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375392 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a47edc9-43df-40ba-8b92-80205500df3c-metrics-tls\") pod \"dns-operator-744455d44c-hq8r6\" (UID: \"6a47edc9-43df-40ba-8b92-80205500df3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375416 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4334562a-5ab5-4601-932e-4a765f2d11a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375463 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mq8w\" (UniqueName: \"kubernetes.io/projected/02137895-2767-439d-af36-1b95bb61aaeb-kube-api-access-6mq8w\") pod \"console-operator-58897d9998-l9zg8\" (UID: 
\"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375488 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-client-ca\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375519 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-image-import-ca\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375562 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26bdd534-df87-4879-b036-377d8c606d5c-images\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375600 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b647c\" (UniqueName: \"kubernetes.io/projected/722d5ccb-2d22-453f-b6d0-8eca00275efb-kube-api-access-b647c\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375638 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/02137895-2767-439d-af36-1b95bb61aaeb-config\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375677 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f337d441-0527-46d0-98f4-a9323a682482-serving-cert\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375703 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpsw\" (UniqueName: \"kubernetes.io/projected/a074b884-bf31-47dc-9257-41a7d4dda13e-kube-api-access-ljpsw\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376223 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-console-config\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.375729 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fj52\" (UniqueName: \"kubernetes.io/projected/9c9294b4-aa19-4670-b826-77b9641ea149-kube-api-access-7fj52\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376378 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376402 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gczp\" (UniqueName: \"kubernetes.io/projected/771b6836-1fda-49f0-b7aa-2c65f7e81dad-kube-api-access-5gczp\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376425 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8823e665-73c6-4f33-a6c2-18a8a750abb9-serving-cert\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376446 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/336946bd-b29f-44dd-a54e-a9a023684726-apiservice-cert\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376478 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0eba7d82-1203-49c2-bc66-7c77d2298a0d-audit-dir\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376499 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4j4w\" (UniqueName: \"kubernetes.io/projected/d86831b4-8c67-47a1-bc74-8db8e8399b26-kube-api-access-k4j4w\") pod \"package-server-manager-789f6589d5-4lw55\" (UID: \"d86831b4-8c67-47a1-bc74-8db8e8399b26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376524 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-encryption-config\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376547 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.376820 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4334562a-5ab5-4601-932e-4a765f2d11a8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377010 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/9c9294b4-aa19-4670-b826-77b9641ea149-auth-proxy-config\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377171 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-service-ca-bundle\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377217 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-trusted-ca-bundle\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377238 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d5ccb-2d22-453f-b6d0-8eca00275efb-serving-cert\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377257 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwbgc\" (UniqueName: \"kubernetes.io/projected/f337d441-0527-46d0-98f4-a9323a682482-kube-api-access-mwbgc\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 
09:07:02.377287 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djhp8\" (UniqueName: \"kubernetes.io/projected/b44f7e9d-3f08-48de-a465-056daf8a4549-kube-api-access-djhp8\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377307 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-oauth-serving-cert\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377390 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-proxy-tls\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377415 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssq5c\" (UniqueName: \"kubernetes.io/projected/6a47edc9-43df-40ba-8b92-80205500df3c-kube-api-access-ssq5c\") pod \"dns-operator-744455d44c-hq8r6\" (UID: \"6a47edc9-43df-40ba-8b92-80205500df3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377434 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/acc8184c-2eca-45cf-acc1-53501bde4e00-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377474 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b44f7e9d-3f08-48de-a465-056daf8a4549-audit-dir\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377499 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377505 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4k5\" (UniqueName: \"kubernetes.io/projected/618adccc-479a-43b8-a44f-eb62ce26108a-kube-api-access-6t4k5\") pod \"downloads-7954f5f757-29mfs\" (UID: \"618adccc-479a-43b8-a44f-eb62ce26108a\") " pod="openshift-console/downloads-7954f5f757-29mfs" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377586 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d86831b4-8c67-47a1-bc74-8db8e8399b26-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4lw55\" (UID: \"d86831b4-8c67-47a1-bc74-8db8e8399b26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377632 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/26bdd534-df87-4879-b036-377d8c606d5c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377666 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377693 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377721 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377747 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-service-ca\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377772 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/acc8184c-2eca-45cf-acc1-53501bde4e00-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377800 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4526\" (UniqueName: \"kubernetes.io/projected/acc8184c-2eca-45cf-acc1-53501bde4e00-kube-api-access-z4526\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377844 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377836 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377858 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-image-import-ca\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377874 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-serving-cert\") pod 
\"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377906 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrd4l\" (UniqueName: \"kubernetes.io/projected/ca8591f8-8280-436e-a261-35cf66909b26-kube-api-access-vrd4l\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377939 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26bdd534-df87-4879-b036-377d8c606d5c-config\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377966 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65m52\" (UniqueName: \"kubernetes.io/projected/26bdd534-df87-4879-b036-377d8c606d5c-kube-api-access-65m52\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377993 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-config\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378036 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-etcd-client\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378064 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-client-ca\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378089 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-encryption-config\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378105 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-serving-cert\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378125 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8591f8-8280-436e-a261-35cf66909b26-serving-cert\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378145 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-47twc\" (UniqueName: \"kubernetes.io/projected/0eba7d82-1203-49c2-bc66-7c77d2298a0d-kube-api-access-47twc\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378165 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378182 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c9294b4-aa19-4670-b826-77b9641ea149-machine-approver-tls\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378198 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdgq\" (UniqueName: \"kubernetes.io/projected/4334562a-5ab5-4601-932e-4a765f2d11a8-kube-api-access-6wdgq\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378213 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acc8184c-2eca-45cf-acc1-53501bde4e00-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: 
\"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378234 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-oauth-config\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378254 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-config\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378288 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/336946bd-b29f-44dd-a54e-a9a023684726-webhook-cert\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378308 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b44f7e9d-3f08-48de-a465-056daf8a4549-node-pullsecrets\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378311 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-audit\") 
pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378328 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378352 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771b6836-1fda-49f0-b7aa-2c65f7e81dad-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378377 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bc5v9\" (UID: \"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378400 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcg7\" (UniqueName: \"kubernetes.io/projected/8823e665-73c6-4f33-a6c2-18a8a750abb9-kube-api-access-nqcg7\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378435 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-97rjm\" (UniqueName: \"kubernetes.io/projected/cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b-kube-api-access-97rjm\") pod \"cluster-samples-operator-665b6dd947-bc5v9\" (UID: \"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378451 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/336946bd-b29f-44dd-a54e-a9a023684726-tmpfs\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378475 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02137895-2767-439d-af36-1b95bb61aaeb-trusted-ca\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378492 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-audit-policies\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.378510 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-config\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.379001 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/771b6836-1fda-49f0-b7aa-2c65f7e81dad-config\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.379403 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g8v79"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.379443 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/26bdd534-df87-4879-b036-377d8c606d5c-images\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.377507 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9294b4-aa19-4670-b826-77b9641ea149-config\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.383067 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-config\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.383196 4935 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6a47edc9-43df-40ba-8b92-80205500df3c-metrics-tls\") pod \"dns-operator-744455d44c-hq8r6\" (UID: \"6a47edc9-43df-40ba-8b92-80205500df3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.384038 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tdxp5"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.384492 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.384623 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.384732 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.384931 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02137895-2767-439d-af36-1b95bb61aaeb-serving-cert\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.385167 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/0eba7d82-1203-49c2-bc66-7c77d2298a0d-audit-dir\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.385218 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-service-ca\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.385784 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-etcd-client\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.385870 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.385935 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02137895-2767-439d-af36-1b95bb61aaeb-config\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.386331 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-79ntp"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 
09:07:02.386959 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-config\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.387097 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/722d5ccb-2d22-453f-b6d0-8eca00275efb-serving-cert\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.387872 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-oauth-serving-cert\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.388494 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f337d441-0527-46d0-98f4-a9323a682482-serving-cert\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.389247 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.390565 
4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-etcd-client\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.391161 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b44f7e9d-3f08-48de-a465-056daf8a4549-node-pullsecrets\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.391379 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.392824 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b44f7e9d-3f08-48de-a465-056daf8a4549-audit-dir\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.393430 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-trusted-ca-bundle\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.393559 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26bdd534-df87-4879-b036-377d8c606d5c-config\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.393570 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02137895-2767-439d-af36-1b95bb61aaeb-trusted-ca\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.393618 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.393860 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/acc8184c-2eca-45cf-acc1-53501bde4e00-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.393899 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-client-ca\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.393936 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 
09:07:02.393933 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-84n6g"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.395043 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-config\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.395001 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/722d5ccb-2d22-453f-b6d0-8eca00275efb-service-ca-bundle\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.395076 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b44f7e9d-3f08-48de-a465-056daf8a4549-etcd-serving-ca\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.396215 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0eba7d82-1203-49c2-bc66-7c77d2298a0d-audit-policies\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.396494 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-serving-cert\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.396645 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/0eba7d82-1203-49c2-bc66-7c77d2298a0d-encryption-config\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.396671 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.396838 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-serving-cert\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.397057 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/771b6836-1fda-49f0-b7aa-2c65f7e81dad-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.397828 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.399024 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.402603 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gx689"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.403994 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4xq"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.405252 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.405817 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/9c9294b4-aa19-4670-b826-77b9641ea149-machine-approver-tls\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.406694 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-oauth-config\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.406804 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-bc5v9\" (UID: \"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.406817 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/acc8184c-2eca-45cf-acc1-53501bde4e00-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.406905 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4334562a-5ab5-4601-932e-4a765f2d11a8-serving-cert\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.409015 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-js55g"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.409878 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bvl78"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.409985 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.411399 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/26bdd534-df87-4879-b036-377d8c606d5c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.411607 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-serving-cert\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.411875 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b44f7e9d-3f08-48de-a465-056daf8a4549-encryption-config\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.413527 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.423228 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.423316 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jwnkk"] Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.433061 4935 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.478787 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482430 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-config\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482472 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9r6j\" (UniqueName: \"kubernetes.io/projected/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-kube-api-access-t9r6j\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482495 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p2fj\" (UniqueName: \"kubernetes.io/projected/336946bd-b29f-44dd-a54e-a9a023684726-kube-api-access-4p2fj\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482512 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482528 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8591f8-8280-436e-a261-35cf66909b26-config\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482546 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-config\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482619 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8823e665-73c6-4f33-a6c2-18a8a750abb9-serving-cert\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482638 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/336946bd-b29f-44dd-a54e-a9a023684726-apiservice-cert\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482669 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4j4w\" (UniqueName: 
\"kubernetes.io/projected/d86831b4-8c67-47a1-bc74-8db8e8399b26-kube-api-access-k4j4w\") pod \"package-server-manager-789f6589d5-4lw55\" (UID: \"d86831b4-8c67-47a1-bc74-8db8e8399b26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482696 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-proxy-tls\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482744 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d86831b4-8c67-47a1-bc74-8db8e8399b26-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4lw55\" (UID: \"d86831b4-8c67-47a1-bc74-8db8e8399b26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482765 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482790 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrd4l\" (UniqueName: \"kubernetes.io/projected/ca8591f8-8280-436e-a261-35cf66909b26-kube-api-access-vrd4l\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482815 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-client-ca\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482859 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8591f8-8280-436e-a261-35cf66909b26-serving-cert\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482888 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482916 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/336946bd-b29f-44dd-a54e-a9a023684726-webhook-cert\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482935 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqcg7\" (UniqueName: 
\"kubernetes.io/projected/8823e665-73c6-4f33-a6c2-18a8a750abb9-kube-api-access-nqcg7\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.482967 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/336946bd-b29f-44dd-a54e-a9a023684726-tmpfs\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.483623 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/336946bd-b29f-44dd-a54e-a9a023684726-tmpfs\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.484799 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-client-ca\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.484901 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.485414 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-config\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.485599 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-config\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.487759 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-proxy-tls\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.488791 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8823e665-73c6-4f33-a6c2-18a8a750abb9-serving-cert\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.488943 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.491131 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d86831b4-8c67-47a1-bc74-8db8e8399b26-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-4lw55\" (UID: \"d86831b4-8c67-47a1-bc74-8db8e8399b26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.493197 4935 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.493429 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mpgxh"] Dec 17 09:07:02 crc kubenswrapper[4935]: W1217 09:07:02.503595 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6bcddbf_eb05_4170_87db_6021b9da7df0.slice/crio-016cd799aff8c35048a8743e006c6a6599d9f2a4f3109b354579cb31985363ab WatchSource:0}: Error finding container 016cd799aff8c35048a8743e006c6a6599d9f2a4f3109b354579cb31985363ab: Status 404 returned error can't find the container with id 016cd799aff8c35048a8743e006c6a6599d9f2a4f3109b354579cb31985363ab Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.512626 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.533405 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.553167 4935 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.573884 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.594680 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.612901 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.633431 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.652010 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.673501 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.692990 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.715077 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.734309 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.754329 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 
09:07:02.772881 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.793771 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.813180 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.841370 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.852853 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.873677 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.893538 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.914011 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.933604 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.953234 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.959122 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/336946bd-b29f-44dd-a54e-a9a023684726-apiservice-cert\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.959169 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/336946bd-b29f-44dd-a54e-a9a023684726-webhook-cert\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.973070 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 17 09:07:02 crc kubenswrapper[4935]: I1217 09:07:02.994142 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.013252 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.033865 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.053416 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.056552 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca8591f8-8280-436e-a261-35cf66909b26-serving-cert\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:03 crc kubenswrapper[4935]: 
I1217 09:07:03.073160 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.093491 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.094365 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca8591f8-8280-436e-a261-35cf66909b26-config\") pod \"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.112903 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.169933 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4k5\" (UniqueName: \"kubernetes.io/projected/618adccc-479a-43b8-a44f-eb62ce26108a-kube-api-access-6t4k5\") pod \"downloads-7954f5f757-29mfs\" (UID: \"618adccc-479a-43b8-a44f-eb62ce26108a\") " pod="openshift-console/downloads-7954f5f757-29mfs" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.176858 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-29mfs" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.193903 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65m52\" (UniqueName: \"kubernetes.io/projected/26bdd534-df87-4879-b036-377d8c606d5c-kube-api-access-65m52\") pod \"machine-api-operator-5694c8668f-b2nl7\" (UID: \"26bdd534-df87-4879-b036-377d8c606d5c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.209799 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mq8w\" (UniqueName: \"kubernetes.io/projected/02137895-2767-439d-af36-1b95bb61aaeb-kube-api-access-6mq8w\") pod \"console-operator-58897d9998-l9zg8\" (UID: \"02137895-2767-439d-af36-1b95bb61aaeb\") " pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.228737 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b647c\" (UniqueName: \"kubernetes.io/projected/722d5ccb-2d22-453f-b6d0-8eca00275efb-kube-api-access-b647c\") pod \"authentication-operator-69f744f599-k2b7h\" (UID: \"722d5ccb-2d22-453f-b6d0-8eca00275efb\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.240864 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.252720 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4526\" (UniqueName: \"kubernetes.io/projected/acc8184c-2eca-45cf-acc1-53501bde4e00-kube-api-access-z4526\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.276411 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gczp\" (UniqueName: \"kubernetes.io/projected/771b6836-1fda-49f0-b7aa-2c65f7e81dad-kube-api-access-5gczp\") pod \"openshift-apiserver-operator-796bbdcf4f-pfnv2\" (UID: \"771b6836-1fda-49f0-b7aa-2c65f7e81dad\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.277918 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" event={"ID":"e6bcddbf-eb05-4170-87db-6021b9da7df0","Type":"ContainerStarted","Data":"0e6ad3573cd4baf01ff2cd6cca17eca16420f168dc4690a74dd17af6ac29c255"} Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.277987 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" event={"ID":"e6bcddbf-eb05-4170-87db-6021b9da7df0","Type":"ContainerStarted","Data":"016cd799aff8c35048a8743e006c6a6599d9f2a4f3109b354579cb31985363ab"} Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.278264 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.289347 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7fj52\" (UniqueName: \"kubernetes.io/projected/9c9294b4-aa19-4670-b826-77b9641ea149-kube-api-access-7fj52\") pod \"machine-approver-56656f9798-hmdsg\" (UID: \"9c9294b4-aa19-4670-b826-77b9641ea149\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.316043 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwbgc\" (UniqueName: \"kubernetes.io/projected/f337d441-0527-46d0-98f4-a9323a682482-kube-api-access-mwbgc\") pod \"controller-manager-879f6c89f-v725p\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.339477 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djhp8\" (UniqueName: \"kubernetes.io/projected/b44f7e9d-3f08-48de-a465-056daf8a4549-kube-api-access-djhp8\") pod \"apiserver-76f77b778f-ckj77\" (UID: \"b44f7e9d-3f08-48de-a465-056daf8a4549\") " pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.350668 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.350949 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.361551 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.365165 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-29mfs"] Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.370575 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpsw\" (UniqueName: \"kubernetes.io/projected/a074b884-bf31-47dc-9257-41a7d4dda13e-kube-api-access-ljpsw\") pod \"console-f9d7485db-nw6k6\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.373872 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssq5c\" (UniqueName: \"kubernetes.io/projected/6a47edc9-43df-40ba-8b92-80205500df3c-kube-api-access-ssq5c\") pod \"dns-operator-744455d44c-hq8r6\" (UID: \"6a47edc9-43df-40ba-8b92-80205500df3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.393764 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/acc8184c-2eca-45cf-acc1-53501bde4e00-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-vkgt6\" (UID: \"acc8184c-2eca-45cf-acc1-53501bde4e00\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.408841 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdgq\" (UniqueName: \"kubernetes.io/projected/4334562a-5ab5-4601-932e-4a765f2d11a8-kube-api-access-6wdgq\") pod \"openshift-config-operator-7777fb866f-5dtrh\" (UID: \"4334562a-5ab5-4601-932e-4a765f2d11a8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 
09:07:03.411893 4935 request.go:700] Waited for 1.020443866s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.433202 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97rjm\" (UniqueName: \"kubernetes.io/projected/cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b-kube-api-access-97rjm\") pod \"cluster-samples-operator-665b6dd947-bc5v9\" (UID: \"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.449568 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-k2b7h"] Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.450736 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47twc\" (UniqueName: \"kubernetes.io/projected/0eba7d82-1203-49c2-bc66-7c77d2298a0d-kube-api-access-47twc\") pod \"apiserver-7bbb656c7d-r8svh\" (UID: \"0eba7d82-1203-49c2-bc66-7c77d2298a0d\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.454787 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.473830 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.494241 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.516949 4935 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.547797 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.549237 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-l9zg8"] Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.552115 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.552633 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89340f7a-cc56-44ef-8fbf-0e4efd9ce27e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-qnwvp\" (UID: \"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.558447 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.566560 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.568475 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-b2nl7"] Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.568633 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9r6j\" (UniqueName: \"kubernetes.io/projected/3207f41f-7cf8-4e0f-80d2-639237ce8b3e-kube-api-access-t9r6j\") pod \"machine-config-controller-84d6567774-8gmgm\" (UID: \"3207f41f-7cf8-4e0f-80d2-639237ce8b3e\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.582510 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.588709 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.589158 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcg7\" (UniqueName: \"kubernetes.io/projected/8823e665-73c6-4f33-a6c2-18a8a750abb9-kube-api-access-nqcg7\") pod \"route-controller-manager-6576b87f9c-mmwcj\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.608631 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4j4w\" (UniqueName: \"kubernetes.io/projected/d86831b4-8c67-47a1-bc74-8db8e8399b26-kube-api-access-k4j4w\") pod \"package-server-manager-789f6589d5-4lw55\" (UID: \"d86831b4-8c67-47a1-bc74-8db8e8399b26\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.628533 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p2fj\" (UniqueName: \"kubernetes.io/projected/336946bd-b29f-44dd-a54e-a9a023684726-kube-api-access-4p2fj\") pod \"packageserver-d55dfcdfc-tfngw\" (UID: \"336946bd-b29f-44dd-a54e-a9a023684726\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.636825 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.642799 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:03 crc kubenswrapper[4935]: W1217 09:07:03.643525 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod722d5ccb_2d22_453f_b6d0_8eca00275efb.slice/crio-2a20bcaf6d70ff41c7d9600c9eeafc3d7c32d24db2b65f6e301db9da889e8476 WatchSource:0}: Error finding container 2a20bcaf6d70ff41c7d9600c9eeafc3d7c32d24db2b65f6e301db9da889e8476: Status 404 returned error can't find the container with id 2a20bcaf6d70ff41c7d9600c9eeafc3d7c32d24db2b65f6e301db9da889e8476 Dec 17 09:07:03 crc kubenswrapper[4935]: W1217 09:07:03.644776 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02137895_2767_439d_af36_1b95bb61aaeb.slice/crio-f0abc108cd74819b2753e64d777c719183e6fcd998ab368e00f628a81bf84596 WatchSource:0}: Error finding container f0abc108cd74819b2753e64d777c719183e6fcd998ab368e00f628a81bf84596: Status 404 returned error can't find the container with id f0abc108cd74819b2753e64d777c719183e6fcd998ab368e00f628a81bf84596 Dec 17 09:07:03 crc kubenswrapper[4935]: W1217 09:07:03.646782 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26bdd534_df87_4879_b036_377d8c606d5c.slice/crio-1c4f341892168f24ebe4ded92d5457f369d864adb39bf500637722eae2687ab3 WatchSource:0}: Error finding container 1c4f341892168f24ebe4ded92d5457f369d864adb39bf500637722eae2687ab3: Status 404 returned error can't find the container with id 1c4f341892168f24ebe4ded92d5457f369d864adb39bf500637722eae2687ab3 Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.654056 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrd4l\" (UniqueName: \"kubernetes.io/projected/ca8591f8-8280-436e-a261-35cf66909b26-kube-api-access-vrd4l\") pod 
\"service-ca-operator-777779d784-84n6g\" (UID: \"ca8591f8-8280-436e-a261-35cf66909b26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.656376 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.669047 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.683196 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.701552 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.701882 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.701924 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-socket-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702003 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a03e09d-6bb6-4241-b1f7-24f864b05640-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702027 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfrhf\" (UniqueName: \"kubernetes.io/projected/9861106b-bcb8-49a2-93a3-14a548a26c57-kube-api-access-tfrhf\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702045 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/50ad2e0b-8528-481e-af4f-f7343d62878a-srv-cert\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702070 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8befec57-6330-4505-bdf6-f9fe6667a8bf-signing-cabundle\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702087 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-485g5\" (UniqueName: \"kubernetes.io/projected/60032bf5-af40-4d89-a7e3-e2e8da6382a3-kube-api-access-485g5\") pod 
\"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702108 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702126 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702144 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a03e09d-6bb6-4241-b1f7-24f864b05640-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702162 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-secret-volume\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702178 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/503b0e00-0b2a-4d33-a742-0dc3a2a73343-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bvl78\" (UID: \"503b0e00-0b2a-4d33-a742-0dc3a2a73343\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702223 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v5tz\" (UniqueName: \"kubernetes.io/projected/ac56f58a-57ad-4d33-931d-2c00504e09fa-kube-api-access-7v5tz\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702247 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-client\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702267 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlbv\" (UniqueName: \"kubernetes.io/projected/50ad2e0b-8528-481e-af4f-f7343d62878a-kube-api-access-7rlbv\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702311 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702361 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e69422fb-c984-49c7-98db-a055b29fa457-proxy-tls\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702405 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/84312d40-3400-410d-9ba1-952f8ffbd442-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7j8g\" (UID: \"84312d40-3400-410d-9ba1-952f8ffbd442\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702445 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36722ee1-e3e3-4533-902a-038a215f40fe-config\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702481 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f54f534-8232-4771-a97b-5ce4f29b8a3d-metrics-tls\") pod \"dns-default-tdxp5\" 
(UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702502 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-default-certificate\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702533 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2aea1606-ff6f-4325-9f92-c83e2c5079c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702557 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/830ca689-7f10-4899-b1fc-1d88feecf243-cert\") pod \"ingress-canary-nmgrc\" (UID: \"830ca689-7f10-4899-b1fc-1d88feecf243\") " pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702578 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f54f534-8232-4771-a97b-5ce4f29b8a3d-config-volume\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702601 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-service-ca\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702644 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-metrics-certs\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702677 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2aea1606-ff6f-4325-9f92-c83e2c5079c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702698 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-plugins-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702719 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8jl\" (UniqueName: \"kubernetes.io/projected/e69422fb-c984-49c7-98db-a055b29fa457-kube-api-access-gh8jl\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc 
kubenswrapper[4935]: I1217 09:07:03.702757 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-tls\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702776 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfhmx\" (UniqueName: \"kubernetes.io/projected/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-kube-api-access-vfhmx\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.702794 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf4h7\" (UniqueName: \"kubernetes.io/projected/8a03e09d-6bb6-4241-b1f7-24f864b05640-kube-api-access-pf4h7\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.704119 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36722ee1-e3e3-4533-902a-038a215f40fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.705017 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"stats-auth\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-stats-auth\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.705075 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e69422fb-c984-49c7-98db-a055b29fa457-images\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.705526 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44zp2\" (UniqueName: \"kubernetes.io/projected/84312d40-3400-410d-9ba1-952f8ffbd442-kube-api-access-44zp2\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7j8g\" (UID: \"84312d40-3400-410d-9ba1-952f8ffbd442\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:03 crc kubenswrapper[4935]: E1217 09:07:03.705678 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.205660456 +0000 UTC m=+143.865501419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.705726 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-metrics-tls\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.706255 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-config\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.706497 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-registration-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.706549 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdnt\" (UniqueName: \"kubernetes.io/projected/888de074-122c-4990-8dc9-c1fa8b4361af-kube-api-access-6vdnt\") pod 
\"migrator-59844c95c7-nc7cp\" (UID: \"888de074-122c-4990-8dc9-c1fa8b4361af\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.706600 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-bound-sa-token\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.706863 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-mountpoint-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.706907 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/50ad2e0b-8528-481e-af4f-f7343d62878a-profile-collector-cert\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707033 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2bs\" (UniqueName: \"kubernetes.io/projected/503b0e00-0b2a-4d33-a742-0dc3a2a73343-kube-api-access-vh2bs\") pod \"multus-admission-controller-857f4d67dd-bvl78\" (UID: \"503b0e00-0b2a-4d33-a742-0dc3a2a73343\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707092 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dggwp\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-kube-api-access-dggwp\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707123 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqft7\" (UniqueName: \"kubernetes.io/projected/8befec57-6330-4505-bdf6-f9fe6667a8bf-kube-api-access-jqft7\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707150 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9861106b-bcb8-49a2-93a3-14a548a26c57-service-ca-bundle\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707176 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhl94\" (UniqueName: \"kubernetes.io/projected/7b089dbd-e3e7-457c-a781-92585ac48eb9-kube-api-access-hhl94\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707253 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-certificates\") pod 
\"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707350 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8befec57-6330-4505-bdf6-f9fe6667a8bf-signing-key\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707382 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49a58e6e-ca3c-438f-b468-06d2f3ed7050-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707438 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kk9\" (UniqueName: \"kubernetes.io/projected/2f54f534-8232-4771-a97b-5ce4f29b8a3d-kube-api-access-l2kk9\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707697 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-ca\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707772 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-csi-data-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707834 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdfv\" (UniqueName: \"kubernetes.io/projected/0407e29c-77f3-481d-8642-91af970b9ef7-kube-api-access-thdfv\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707876 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707922 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e69422fb-c984-49c7-98db-a055b29fa457-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707952 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4qxm\" (UniqueName: \"kubernetes.io/projected/830ca689-7f10-4899-b1fc-1d88feecf243-kube-api-access-q4qxm\") pod \"ingress-canary-nmgrc\" (UID: \"830ca689-7f10-4899-b1fc-1d88feecf243\") " pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 
17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.707981 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-trusted-ca\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708003 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b089dbd-e3e7-457c-a781-92585ac48eb9-srv-cert\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708026 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36722ee1-e3e3-4533-902a-038a215f40fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708070 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-trusted-ca\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708100 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq47v\" (UniqueName: 
\"kubernetes.io/projected/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-kube-api-access-qq47v\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708158 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cxj5\" (UniqueName: \"kubernetes.io/projected/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-kube-api-access-6cxj5\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708187 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708228 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac56f58a-57ad-4d33-931d-2c00504e09fa-serving-cert\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708265 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a58e6e-ca3c-438f-b468-06d2f3ed7050-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") 
" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708879 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49a58e6e-ca3c-438f-b468-06d2f3ed7050-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708945 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7b089dbd-e3e7-457c-a781-92585ac48eb9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.708970 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-config-volume\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.740658 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.777795 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815210 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815567 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqft7\" (UniqueName: \"kubernetes.io/projected/8befec57-6330-4505-bdf6-f9fe6667a8bf-kube-api-access-jqft7\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815601 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9861106b-bcb8-49a2-93a3-14a548a26c57-service-ca-bundle\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: E1217 09:07:03.815641 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.315600461 +0000 UTC m=+143.975441224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815739 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhl94\" (UniqueName: \"kubernetes.io/projected/7b089dbd-e3e7-457c-a781-92585ac48eb9-kube-api-access-hhl94\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815788 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-certificates\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815824 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8befec57-6330-4505-bdf6-f9fe6667a8bf-signing-key\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815847 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49a58e6e-ca3c-438f-b468-06d2f3ed7050-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815878 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-ca\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815900 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kk9\" (UniqueName: \"kubernetes.io/projected/2f54f534-8232-4771-a97b-5ce4f29b8a3d-kube-api-access-l2kk9\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815924 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-csi-data-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815951 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdfv\" (UniqueName: \"kubernetes.io/projected/0407e29c-77f3-481d-8642-91af970b9ef7-kube-api-access-thdfv\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815968 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e69422fb-c984-49c7-98db-a055b29fa457-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.815992 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4qxm\" (UniqueName: \"kubernetes.io/projected/830ca689-7f10-4899-b1fc-1d88feecf243-kube-api-access-q4qxm\") pod \"ingress-canary-nmgrc\" (UID: \"830ca689-7f10-4899-b1fc-1d88feecf243\") " pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816023 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-trusted-ca\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816049 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b089dbd-e3e7-457c-a781-92585ac48eb9-srv-cert\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816068 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816096 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-trusted-ca\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816115 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36722ee1-e3e3-4533-902a-038a215f40fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816136 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qq47v\" (UniqueName: \"kubernetes.io/projected/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-kube-api-access-qq47v\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816157 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e3ee8bd0-3faf-4c7f-bacc-022677b95019-certs\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816211 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cxj5\" (UniqueName: \"kubernetes.io/projected/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-kube-api-access-6cxj5\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816229 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e3ee8bd0-3faf-4c7f-bacc-022677b95019-node-bootstrap-token\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816264 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac56f58a-57ad-4d33-931d-2c00504e09fa-serving-cert\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816301 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816335 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a58e6e-ca3c-438f-b468-06d2f3ed7050-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816357 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/49a58e6e-ca3c-438f-b468-06d2f3ed7050-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816407 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7b089dbd-e3e7-457c-a781-92585ac48eb9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816425 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-config-volume\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816471 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816540 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-socket-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc 
kubenswrapper[4935]: I1217 09:07:03.816573 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a03e09d-6bb6-4241-b1f7-24f864b05640-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816604 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/50ad2e0b-8528-481e-af4f-f7343d62878a-srv-cert\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816633 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8befec57-6330-4505-bdf6-f9fe6667a8bf-signing-cabundle\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816653 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-485g5\" (UniqueName: \"kubernetes.io/projected/60032bf5-af40-4d89-a7e3-e2e8da6382a3-kube-api-access-485g5\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816670 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfrhf\" (UniqueName: \"kubernetes.io/projected/9861106b-bcb8-49a2-93a3-14a548a26c57-kube-api-access-tfrhf\") pod \"router-default-5444994796-gtvxm\" 
(UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816703 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816720 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816738 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-secret-volume\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816771 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/503b0e00-0b2a-4d33-a742-0dc3a2a73343-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bvl78\" (UID: \"503b0e00-0b2a-4d33-a742-0dc3a2a73343\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816789 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a03e09d-6bb6-4241-b1f7-24f864b05640-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816818 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v5tz\" (UniqueName: \"kubernetes.io/projected/ac56f58a-57ad-4d33-931d-2c00504e09fa-kube-api-access-7v5tz\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816835 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-client\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816881 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816900 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlbv\" (UniqueName: \"kubernetes.io/projected/50ad2e0b-8528-481e-af4f-f7343d62878a-kube-api-access-7rlbv\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816937 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/84312d40-3400-410d-9ba1-952f8ffbd442-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7j8g\" (UID: \"84312d40-3400-410d-9ba1-952f8ffbd442\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816959 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e69422fb-c984-49c7-98db-a055b29fa457-proxy-tls\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.816983 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lr22\" (UniqueName: \"kubernetes.io/projected/e3ee8bd0-3faf-4c7f-bacc-022677b95019-kube-api-access-2lr22\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817005 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36722ee1-e3e3-4533-902a-038a215f40fe-config\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817035 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f54f534-8232-4771-a97b-5ce4f29b8a3d-metrics-tls\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817057 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2aea1606-ff6f-4325-9f92-c83e2c5079c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817074 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-default-certificate\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817092 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f54f534-8232-4771-a97b-5ce4f29b8a3d-config-volume\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817124 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-service-ca\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817143 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/830ca689-7f10-4899-b1fc-1d88feecf243-cert\") pod \"ingress-canary-nmgrc\" (UID: \"830ca689-7f10-4899-b1fc-1d88feecf243\") " pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817167 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-metrics-certs\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817184 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-plugins-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817203 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8jl\" (UniqueName: \"kubernetes.io/projected/e69422fb-c984-49c7-98db-a055b29fa457-kube-api-access-gh8jl\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817233 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2aea1606-ff6f-4325-9f92-c83e2c5079c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817249 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-tls\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817492 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfhmx\" (UniqueName: \"kubernetes.io/projected/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-kube-api-access-vfhmx\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817518 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf4h7\" (UniqueName: \"kubernetes.io/projected/8a03e09d-6bb6-4241-b1f7-24f864b05640-kube-api-access-pf4h7\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817557 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36722ee1-e3e3-4533-902a-038a215f40fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817574 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-stats-auth\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " 
pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817591 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e69422fb-c984-49c7-98db-a055b29fa457-images\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817633 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44zp2\" (UniqueName: \"kubernetes.io/projected/84312d40-3400-410d-9ba1-952f8ffbd442-kube-api-access-44zp2\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7j8g\" (UID: \"84312d40-3400-410d-9ba1-952f8ffbd442\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817668 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-metrics-tls\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817737 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-config\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817753 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-registration-dir\") pod 
\"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.817771 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdnt\" (UniqueName: \"kubernetes.io/projected/888de074-122c-4990-8dc9-c1fa8b4361af-kube-api-access-6vdnt\") pod \"migrator-59844c95c7-nc7cp\" (UID: \"888de074-122c-4990-8dc9-c1fa8b4361af\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.818794 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-ca\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: E1217 09:07:03.819229 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.319211237 +0000 UTC m=+143.979052000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.820253 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36722ee1-e3e3-4533-902a-038a215f40fe-config\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.820442 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-plugins-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.821672 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac56f58a-57ad-4d33-931d-2c00504e09fa-serving-cert\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.821776 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.822578 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-csi-data-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.822786 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49a58e6e-ca3c-438f-b468-06d2f3ed7050-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.823004 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-registration-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.823960 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-config\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.824483 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9861106b-bcb8-49a2-93a3-14a548a26c57-service-ca-bundle\") pod \"router-default-5444994796-gtvxm\" (UID: 
\"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.825158 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-bound-sa-token\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.825196 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-mountpoint-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.825222 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/50ad2e0b-8528-481e-af4f-f7343d62878a-profile-collector-cert\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.825342 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dggwp\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-kube-api-access-dggwp\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.825374 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2bs\" (UniqueName: 
\"kubernetes.io/projected/503b0e00-0b2a-4d33-a742-0dc3a2a73343-kube-api-access-vh2bs\") pod \"multus-admission-controller-857f4d67dd-bvl78\" (UID: \"503b0e00-0b2a-4d33-a742-0dc3a2a73343\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.825756 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.827137 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8befec57-6330-4505-bdf6-f9fe6667a8bf-signing-cabundle\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.834141 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-trusted-ca\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.836643 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36722ee1-e3e3-4533-902a-038a215f40fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.836778 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8befec57-6330-4505-bdf6-f9fe6667a8bf-signing-key\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.837795 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e69422fb-c984-49c7-98db-a055b29fa457-images\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.838101 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-trusted-ca\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.839051 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.839083 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2aea1606-ff6f-4325-9f92-c83e2c5079c0-ca-trust-extracted\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 
09:07:03.839159 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-socket-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.840154 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49a58e6e-ca3c-438f-b468-06d2f3ed7050-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.840723 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f54f534-8232-4771-a97b-5ce4f29b8a3d-config-volume\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.840820 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-certificates\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.841039 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e69422fb-c984-49c7-98db-a055b29fa457-auth-proxy-config\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 
crc kubenswrapper[4935]: I1217 09:07:03.841167 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/50ad2e0b-8528-481e-af4f-f7343d62878a-srv-cert\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.841403 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/830ca689-7f10-4899-b1fc-1d88feecf243-cert\") pod \"ingress-canary-nmgrc\" (UID: \"830ca689-7f10-4899-b1fc-1d88feecf243\") " pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.841685 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a03e09d-6bb6-4241-b1f7-24f864b05640-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.841799 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2aea1606-ff6f-4325-9f92-c83e2c5079c0-installation-pull-secrets\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.842503 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.843799 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-client\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.845403 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-secret-volume\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.845556 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-config-volume\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.846201 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac56f58a-57ad-4d33-931d-2c00504e09fa-etcd-service-ca\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.846499 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0407e29c-77f3-481d-8642-91af970b9ef7-mountpoint-dir\") pod \"csi-hostpathplugin-79ntp\" (UID: 
\"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.850182 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2f54f534-8232-4771-a97b-5ce4f29b8a3d-metrics-tls\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.855551 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e69422fb-c984-49c7-98db-a055b29fa457-proxy-tls\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.858003 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a03e09d-6bb6-4241-b1f7-24f864b05640-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.858717 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqft7\" (UniqueName: \"kubernetes.io/projected/8befec57-6330-4505-bdf6-f9fe6667a8bf-kube-api-access-jqft7\") pod \"service-ca-9c57cc56f-jwnkk\" (UID: \"8befec57-6330-4505-bdf6-f9fe6667a8bf\") " pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.858959 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-default-certificate\") pod 
\"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.859456 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/50ad2e0b-8528-481e-af4f-f7343d62878a-profile-collector-cert\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.860164 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-tls\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.860570 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/84312d40-3400-410d-9ba1-952f8ffbd442-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7j8g\" (UID: \"84312d40-3400-410d-9ba1-952f8ffbd442\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.861112 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/503b0e00-0b2a-4d33-a742-0dc3a2a73343-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-bvl78\" (UID: \"503b0e00-0b2a-4d33-a742-0dc3a2a73343\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.865035 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-metrics-tls\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.865702 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-metrics-certs\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.866459 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b089dbd-e3e7-457c-a781-92585ac48eb9-srv-cert\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.873216 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7b089dbd-e3e7-457c-a781-92585ac48eb9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.876295 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9861106b-bcb8-49a2-93a3-14a548a26c57-stats-auth\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.884748 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2"] Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.885447 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cxj5\" (UniqueName: \"kubernetes.io/projected/fb17d1b3-11c8-491b-bf10-35bd9142ad4c-kube-api-access-6cxj5\") pod \"openshift-controller-manager-operator-756b6f6bc6-bcz66\" (UID: \"fb17d1b3-11c8-491b-bf10-35bd9142ad4c\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.897513 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhl94\" (UniqueName: \"kubernetes.io/projected/7b089dbd-e3e7-457c-a781-92585ac48eb9-kube-api-access-hhl94\") pod \"olm-operator-6b444d44fb-2nltt\" (UID: \"7b089dbd-e3e7-457c-a781-92585ac48eb9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.915175 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-485g5\" (UniqueName: \"kubernetes.io/projected/60032bf5-af40-4d89-a7e3-e2e8da6382a3-kube-api-access-485g5\") pod \"marketplace-operator-79b997595-5j4xq\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.922838 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.926905 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.927163 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lr22\" (UniqueName: \"kubernetes.io/projected/e3ee8bd0-3faf-4c7f-bacc-022677b95019-kube-api-access-2lr22\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.927332 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e3ee8bd0-3faf-4c7f-bacc-022677b95019-certs\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.927353 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e3ee8bd0-3faf-4c7f-bacc-022677b95019-node-bootstrap-token\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: E1217 09:07:03.929400 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-17 09:07:04.429375106 +0000 UTC m=+144.089215869 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.929985 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.944092 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e3ee8bd0-3faf-4c7f-bacc-022677b95019-certs\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.944454 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49a58e6e-ca3c-438f-b468-06d2f3ed7050-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wmgg7\" (UID: \"49a58e6e-ca3c-438f-b468-06d2f3ed7050\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.945170 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e3ee8bd0-3faf-4c7f-bacc-022677b95019-node-bootstrap-token\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " 
pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.970466 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36722ee1-e3e3-4533-902a-038a215f40fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8nsh2\" (UID: \"36722ee1-e3e3-4533-902a-038a215f40fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.991009 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf4h7\" (UniqueName: \"kubernetes.io/projected/8a03e09d-6bb6-4241-b1f7-24f864b05640-kube-api-access-pf4h7\") pod \"kube-storage-version-migrator-operator-b67b599dd-k7vbl\" (UID: \"8a03e09d-6bb6-4241-b1f7-24f864b05640\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.994307 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" Dec 17 09:07:03 crc kubenswrapper[4935]: I1217 09:07:03.997841 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlbv\" (UniqueName: \"kubernetes.io/projected/50ad2e0b-8528-481e-af4f-f7343d62878a-kube-api-access-7rlbv\") pod \"catalog-operator-68c6474976-k6mmw\" (UID: \"50ad2e0b-8528-481e-af4f-f7343d62878a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.026185 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44zp2\" (UniqueName: \"kubernetes.io/projected/84312d40-3400-410d-9ba1-952f8ffbd442-kube-api-access-44zp2\") pod \"control-plane-machine-set-operator-78cbb6b69f-d7j8g\" (UID: \"84312d40-3400-410d-9ba1-952f8ffbd442\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.029288 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.029935 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.529913751 +0000 UTC m=+144.189754514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.046941 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.055982 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2bs\" (UniqueName: \"kubernetes.io/projected/503b0e00-0b2a-4d33-a742-0dc3a2a73343-kube-api-access-vh2bs\") pod \"multus-admission-controller-857f4d67dd-bvl78\" (UID: \"503b0e00-0b2a-4d33-a742-0dc3a2a73343\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.061960 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.068399 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.090676 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.095553 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.099925 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8jl\" (UniqueName: \"kubernetes.io/projected/e69422fb-c984-49c7-98db-a055b29fa457-kube-api-access-gh8jl\") pod \"machine-config-operator-74547568cd-z5h6w\" (UID: \"e69422fb-c984-49c7-98db-a055b29fa457\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.109315 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfrhf\" (UniqueName: \"kubernetes.io/projected/9861106b-bcb8-49a2-93a3-14a548a26c57-kube-api-access-tfrhf\") pod \"router-default-5444994796-gtvxm\" (UID: \"9861106b-bcb8-49a2-93a3-14a548a26c57\") " pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.118335 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kk9\" (UniqueName: \"kubernetes.io/projected/2f54f534-8232-4771-a97b-5ce4f29b8a3d-kube-api-access-l2kk9\") pod \"dns-default-tdxp5\" (UID: \"2f54f534-8232-4771-a97b-5ce4f29b8a3d\") " pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.123081 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.130233 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.130831 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.630809235 +0000 UTC m=+144.290649988 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.141811 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdnt\" (UniqueName: \"kubernetes.io/projected/888de074-122c-4990-8dc9-c1fa8b4361af-kube-api-access-6vdnt\") pod \"migrator-59844c95c7-nc7cp\" (UID: \"888de074-122c-4990-8dc9-c1fa8b4361af\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.143398 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.152019 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-nw6k6"] Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.159226 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.171500 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq47v\" (UniqueName: \"kubernetes.io/projected/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-kube-api-access-qq47v\") pod \"collect-profiles-29432700-jt6zc\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.193162 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4qxm\" (UniqueName: \"kubernetes.io/projected/830ca689-7f10-4899-b1fc-1d88feecf243-kube-api-access-q4qxm\") pod \"ingress-canary-nmgrc\" (UID: \"830ca689-7f10-4899-b1fc-1d88feecf243\") " pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.203211 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.213611 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.215334 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdfv\" (UniqueName: \"kubernetes.io/projected/0407e29c-77f3-481d-8642-91af970b9ef7-kube-api-access-thdfv\") pod \"csi-hostpathplugin-79ntp\" (UID: \"0407e29c-77f3-481d-8642-91af970b9ef7\") " pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.234082 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.234599 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.734582125 +0000 UTC m=+144.394422878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.239008 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-bound-sa-token\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.244884 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.254202 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfhmx\" (UniqueName: \"kubernetes.io/projected/05f18f0f-eb23-4bc8-ab78-9ff1c84d825d-kube-api-access-vfhmx\") pod \"ingress-operator-5b745b69d9-gx689\" (UID: \"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.304361 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v5tz\" (UniqueName: \"kubernetes.io/projected/ac56f58a-57ad-4d33-931d-2c00504e09fa-kube-api-access-7v5tz\") pod \"etcd-operator-b45778765-mqg4s\" (UID: \"ac56f58a-57ad-4d33-931d-2c00504e09fa\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.313313 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.321247 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.322140 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lr22\" (UniqueName: \"kubernetes.io/projected/e3ee8bd0-3faf-4c7f-bacc-022677b95019-kube-api-access-2lr22\") pod \"machine-config-server-js55g\" (UID: \"e3ee8bd0-3faf-4c7f-bacc-022677b95019\") " pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.332234 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.337599 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6"] Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.341505 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.341854 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.841835248 +0000 UTC m=+144.501676011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.358717 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.361426 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dggwp\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-kube-api-access-dggwp\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.415567 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.435757 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-l9zg8" event={"ID":"02137895-2767-439d-af36-1b95bb61aaeb","Type":"ContainerStarted","Data":"cb5ca9761d5be6994f631af0e417c2181ce4041dc34a28bd6be527418e03270c"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.435818 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-l9zg8" event={"ID":"02137895-2767-439d-af36-1b95bb61aaeb","Type":"ContainerStarted","Data":"f0abc108cd74819b2753e64d777c719183e6fcd998ab368e00f628a81bf84596"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.438114 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-l9zg8" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.438509 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-nmgrc" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.440542 4935 patch_prober.go:28] interesting pod/console-operator-58897d9998-l9zg8 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.440605 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-l9zg8" podUID="02137895-2767-439d-af36-1b95bb61aaeb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.452537 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.453034 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:04.953015865 +0000 UTC m=+144.612856628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.474977 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" event={"ID":"9c9294b4-aa19-4670-b826-77b9641ea149","Type":"ContainerStarted","Data":"f7cc7e12b619d87fb313e9a9b91ca75d643794b5dea04a87bc024010eed3b535"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.475412 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.495607 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" event={"ID":"771b6836-1fda-49f0-b7aa-2c65f7e81dad","Type":"ContainerStarted","Data":"941b7c9c0f398298ec68b4df60cf4c896246df1108288b117586d7470bff7a6b"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.495664 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" event={"ID":"771b6836-1fda-49f0-b7aa-2c65f7e81dad","Type":"ContainerStarted","Data":"3ecaabac527c7f5c25fed98093dd67619edacdec6962d7cbc496b74074b7ec43"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.501446 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" 
event={"ID":"26bdd534-df87-4879-b036-377d8c606d5c","Type":"ContainerStarted","Data":"c0794dc32274d70ec4c1a9de99c62cace42dc934abe8beed385fa404325a0dc3"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.501513 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" event={"ID":"26bdd534-df87-4879-b036-377d8c606d5c","Type":"ContainerStarted","Data":"1c4f341892168f24ebe4ded92d5457f369d864adb39bf500637722eae2687ab3"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.506391 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" event={"ID":"722d5ccb-2d22-453f-b6d0-8eca00275efb","Type":"ContainerStarted","Data":"ecb96e8b70a9dd065681414ef3ecf7a8ba6cd37bdd754730b9e9fe51c87ea724"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.506428 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" event={"ID":"722d5ccb-2d22-453f-b6d0-8eca00275efb","Type":"ContainerStarted","Data":"2a20bcaf6d70ff41c7d9600c9eeafc3d7c32d24db2b65f6e301db9da889e8476"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.511925 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-29mfs" event={"ID":"618adccc-479a-43b8-a44f-eb62ce26108a","Type":"ContainerStarted","Data":"95aadc4ce99b416da1d0b8ac95cfe2e15c55c702e8fcf0f392c42cbcb8d613fd"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.512022 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-29mfs" event={"ID":"618adccc-479a-43b8-a44f-eb62ce26108a","Type":"ContainerStarted","Data":"f69f14eeb0481b132e79cc97b9eb2df30d1098bdfc01fb06b170658579b774d0"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.512203 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-29mfs" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.516375 4935 patch_prober.go:28] interesting pod/downloads-7954f5f757-29mfs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.516443 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-29mfs" podUID="618adccc-479a-43b8-a44f-eb62ce26108a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.517811 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nw6k6" event={"ID":"a074b884-bf31-47dc-9257-41a7d4dda13e","Type":"ContainerStarted","Data":"380aa846562ba6cde2a760c3333d15d04b8441abbdcf855ed87b9775d4b7f36a"} Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.553894 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.554023 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.05395784 +0000 UTC m=+144.713798603 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.554827 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.555896 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.055869851 +0000 UTC m=+144.715710614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.563258 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-js55g" Dec 17 09:07:04 crc kubenswrapper[4935]: W1217 09:07:04.587197 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9861106b_bcb8_49a2_93a3_14a548a26c57.slice/crio-6eb84061232f4f304003067a0513a3f9046922ae1ea8bbaa6d7d3dd06acf05db WatchSource:0}: Error finding container 6eb84061232f4f304003067a0513a3f9046922ae1ea8bbaa6d7d3dd06acf05db: Status 404 returned error can't find the container with id 6eb84061232f4f304003067a0513a3f9046922ae1ea8bbaa6d7d3dd06acf05db Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.656418 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.659472 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.159441336 +0000 UTC m=+144.819282099 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: W1217 09:07:04.661579 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ee8bd0_3faf_4c7f_bacc_022677b95019.slice/crio-93fef295066298d5901d4ec676dccfc8e2e07089b55485ad0fbc8b9e7252d2fa WatchSource:0}: Error finding container 93fef295066298d5901d4ec676dccfc8e2e07089b55485ad0fbc8b9e7252d2fa: Status 404 returned error can't find the container with id 93fef295066298d5901d4ec676dccfc8e2e07089b55485ad0fbc8b9e7252d2fa Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.761947 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.763216 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.263157755 +0000 UTC m=+144.922998508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.799043 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw"] Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.864250 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.864554 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.364506521 +0000 UTC m=+145.024347284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.864789 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.865199 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.365182949 +0000 UTC m=+145.025023712 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:04 crc kubenswrapper[4935]: I1217 09:07:04.967162 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:04 crc kubenswrapper[4935]: E1217 09:07:04.967645 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.467626314 +0000 UTC m=+145.127467077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.012799 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v725p"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.034795 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" podStartSLOduration=122.034770314 podStartE2EDuration="2m2.034770314s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:05.017799674 +0000 UTC m=+144.677640447" watchObservedRunningTime="2025-12-17 09:07:05.034770314 +0000 UTC m=+144.694611087"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.036234 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.069432 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.069859 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.569844004 +0000 UTC m=+145.229684767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: W1217 09:07:05.124033 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eba7d82_1203_49c2_bc66_7c77d2298a0d.slice/crio-1f9ba34f1761e3976b9b9947e7172a2787884016d9f08195273d4308601b8450 WatchSource:0}: Error finding container 1f9ba34f1761e3976b9b9947e7172a2787884016d9f08195273d4308601b8450: Status 404 returned error can't find the container with id 1f9ba34f1761e3976b9b9947e7172a2787884016d9f08195273d4308601b8450
Dec 17 09:07:05 crc kubenswrapper[4935]: W1217 09:07:05.126934 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf337d441_0527_46d0_98f4_a9323a682482.slice/crio-e909646f1e2c6cdd3301255c3e4eda1e7abc21113e08b0e88d9479fce5344f02 WatchSource:0}: Error finding container e909646f1e2c6cdd3301255c3e4eda1e7abc21113e08b0e88d9479fce5344f02: Status 404 returned error can't find the container with id e909646f1e2c6cdd3301255c3e4eda1e7abc21113e08b0e88d9479fce5344f02
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.177099 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.177565 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.677543468 +0000 UTC m=+145.337384231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.275105 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hq8r6"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.279942 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.280973 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.780956749 +0000 UTC m=+145.440797512 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.281010 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ckj77"]
Dec 17 09:07:05 crc kubenswrapper[4935]: W1217 09:07:05.294656 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a47edc9_43df_40ba_8b92_80205500df3c.slice/crio-1bab2f96c01fe222b935de5679dbfa1530534eb0f3c1983d1c4ac2799a11ee88 WatchSource:0}: Error finding container 1bab2f96c01fe222b935de5679dbfa1530534eb0f3c1983d1c4ac2799a11ee88: Status 404 returned error can't find the container with id 1bab2f96c01fe222b935de5679dbfa1530534eb0f3c1983d1c4ac2799a11ee88
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.383796 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.384199 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.884183244 +0000 UTC m=+145.544024007 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.486053 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.489210 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:05.989184137 +0000 UTC m=+145.649024900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.555384 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.584386 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-js55g" event={"ID":"e3ee8bd0-3faf-4c7f-bacc-022677b95019","Type":"ContainerStarted","Data":"6ad1e1f364c4f98fd9e77b3f97f608b09d4e4ac17453d6cba01fd308c194b9ac"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.584447 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-js55g" event={"ID":"e3ee8bd0-3faf-4c7f-bacc-022677b95019","Type":"ContainerStarted","Data":"93fef295066298d5901d4ec676dccfc8e2e07089b55485ad0fbc8b9e7252d2fa"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.590110 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.590644 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.090619506 +0000 UTC m=+145.750460269 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.605507 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gtvxm" event={"ID":"9861106b-bcb8-49a2-93a3-14a548a26c57","Type":"ContainerStarted","Data":"5949f3c074fb46c0ec8c62a8436299c9f284e79530fbe36bf042389073a327f7"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.605573 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gtvxm" event={"ID":"9861106b-bcb8-49a2-93a3-14a548a26c57","Type":"ContainerStarted","Data":"6eb84061232f4f304003067a0513a3f9046922ae1ea8bbaa6d7d3dd06acf05db"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.608898 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.627482 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" event={"ID":"acc8184c-2eca-45cf-acc1-53501bde4e00","Type":"ContainerStarted","Data":"ab5bca035241de533f5297b4b8dcdba67518525bdf51af7bceb6245b16703d7a"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.627538 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" event={"ID":"acc8184c-2eca-45cf-acc1-53501bde4e00","Type":"ContainerStarted","Data":"adaae24bea64646ac046894a596dd43da318ef924b3aedafd601c68f5982caf9"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.629112 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.640293 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" event={"ID":"9c9294b4-aa19-4670-b826-77b9641ea149","Type":"ContainerStarted","Data":"005505f854386dc274a51747b49cb42a9d85a2ebd33224c5a2d4f42e216f593e"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.640362 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" event={"ID":"9c9294b4-aa19-4670-b826-77b9641ea149","Type":"ContainerStarted","Data":"b5a5df0ae10aba4e0ccd7dcc3b212c6919a71a8b260e19da923c3e82789b04fe"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.642118 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.642162 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" event={"ID":"0eba7d82-1203-49c2-bc66-7c77d2298a0d","Type":"ContainerStarted","Data":"1f9ba34f1761e3976b9b9947e7172a2787884016d9f08195273d4308601b8450"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.646501 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" event={"ID":"336946bd-b29f-44dd-a54e-a9a023684726","Type":"ContainerStarted","Data":"ed4020d6555573bcf254291861ab10c360f34f954173bf05e99819eaf459f132"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.646558 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" event={"ID":"336946bd-b29f-44dd-a54e-a9a023684726","Type":"ContainerStarted","Data":"51077af981fa65977aab9f6962a18d4cba48137aaedb47abe5df9a5d2a19a139"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.647499 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.647561 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.661751 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.666500 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4xq"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.668890 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.671715 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-84n6g"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.675306 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.675433 4935 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tfngw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body=
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.675485 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" podUID="336946bd-b29f-44dd-a54e-a9a023684726" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.675608 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" event={"ID":"6a47edc9-43df-40ba-8b92-80205500df3c","Type":"ContainerStarted","Data":"1bab2f96c01fe222b935de5679dbfa1530534eb0f3c1983d1c4ac2799a11ee88"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.691670 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.693766 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.193752129 +0000 UTC m=+145.853592892 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.696705 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" event={"ID":"b44f7e9d-3f08-48de-a465-056daf8a4549","Type":"ContainerStarted","Data":"13fe02e4e1ae082f92f0147eddfc74f2791b00c51fcbdeaa14e7a48bffbc6244"}
Dec 17 09:07:05 crc kubenswrapper[4935]: W1217 09:07:05.720830 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89340f7a_cc56_44ef_8fbf_0e4efd9ce27e.slice/crio-926815ffbef8feb894efde84385cf53f2a306021e2c9496694b18c009c80c17b WatchSource:0}: Error finding container 926815ffbef8feb894efde84385cf53f2a306021e2c9496694b18c009c80c17b: Status 404 returned error can't find the container with id 926815ffbef8feb894efde84385cf53f2a306021e2c9496694b18c009c80c17b
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.726767 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" event={"ID":"26bdd534-df87-4879-b036-377d8c606d5c","Type":"ContainerStarted","Data":"7be2e00d181f9064727dde9287bb0535a8187fe2b1590d70b5c9863620c32270"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.773324 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-29mfs" podStartSLOduration=122.773293578 podStartE2EDuration="2m2.773293578s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:05.767254788 +0000 UTC m=+145.427095551" watchObservedRunningTime="2025-12-17 09:07:05.773293578 +0000 UTC m=+145.433134331"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.775175 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nw6k6" event={"ID":"a074b884-bf31-47dc-9257-41a7d4dda13e","Type":"ContainerStarted","Data":"435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.791530 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" event={"ID":"f337d441-0527-46d0-98f4-a9323a682482","Type":"ContainerStarted","Data":"e909646f1e2c6cdd3301255c3e4eda1e7abc21113e08b0e88d9479fce5344f02"}
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.791633 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.794923 4935 patch_prober.go:28] interesting pod/downloads-7954f5f757-29mfs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body=
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.795051 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.794971 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-29mfs" podUID="618adccc-479a-43b8-a44f-eb62ce26108a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused"
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.795343 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.295317662 +0000 UTC m=+145.955158425 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.795594 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.799351 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.299327868 +0000 UTC m=+145.959168691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.808531 4935 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-v725p container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body=
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.808842 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" podUID="f337d441-0527-46d0-98f4-a9323a682482" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.809259 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-b2nl7" podStartSLOduration=121.809238991 podStartE2EDuration="2m1.809238991s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:05.80771604 +0000 UTC m=+145.467556803" watchObservedRunningTime="2025-12-17 09:07:05.809238991 +0000 UTC m=+145.469079754"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.841361 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.894025 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gtvxm" podStartSLOduration=121.893998317 podStartE2EDuration="2m1.893998317s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:05.892256571 +0000 UTC m=+145.552097334" watchObservedRunningTime="2025-12-17 09:07:05.893998317 +0000 UTC m=+145.553839080"
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.894317 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tdxp5"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.898039 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 17 09:07:05 crc kubenswrapper[4935]: E1217 09:07:05.898918 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.398897357 +0000 UTC m=+146.058738110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.932581 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-79ntp"]
Dec 17 09:07:05 crc kubenswrapper[4935]: I1217 09:07:05.972140 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-l9zg8" podStartSLOduration=122.972117517 podStartE2EDuration="2m2.972117517s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:05.965528372 +0000 UTC m=+145.625369135" watchObservedRunningTime="2025-12-17 09:07:05.972117517 +0000 UTC m=+145.631958280"
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.012525 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.013023 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.513006281 +0000 UTC m=+146.172847044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.013068 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mqg4s"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.043596 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.052593 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.075359 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-jwnkk"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.079674 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.087051 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-l9zg8"
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.091680 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.092318 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-pfnv2" podStartSLOduration=123.092305163 podStartE2EDuration="2m3.092305163s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.043485989 +0000 UTC m=+145.703326752" watchObservedRunningTime="2025-12-17 09:07:06.092305163 +0000 UTC m=+145.752145946"
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.112509 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.113357 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.113489 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7"]
Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.113815 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.613798292 +0000 UTC m=+146.273639055 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.115117 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-nmgrc"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.122863 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" podStartSLOduration=123.122839802 podStartE2EDuration="2m3.122839802s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.085249226 +0000 UTC m=+145.745089999" watchObservedRunningTime="2025-12-17 09:07:06.122839802 +0000 UTC m=+145.782680565"
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.125385 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-bvl78"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.127325 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gx689"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.128422 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" podStartSLOduration=122.12841041 podStartE2EDuration="2m2.12841041s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.106840328 +0000 UTC m=+145.766681201" watchObservedRunningTime="2025-12-17 09:07:06.12841041 +0000 UTC m=+145.788251173"
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.146208 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-js55g" podStartSLOduration=5.14617265 podStartE2EDuration="5.14617265s" podCreationTimestamp="2025-12-17 09:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.129463118 +0000 UTC m=+145.789303881" watchObservedRunningTime="2025-12-17 09:07:06.14617265 +0000 UTC m=+145.806013413"
Dec 17 09:07:06 crc kubenswrapper[4935]: W1217 09:07:06.149070 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84312d40_3400_410d_9ba1_952f8ffbd442.slice/crio-3fcc18b9cb0c8197fffa39ee9724067ddc4d1499f35f5a5cdae8f1faafee15e7 WatchSource:0}: Error finding container 3fcc18b9cb0c8197fffa39ee9724067ddc4d1499f35f5a5cdae8f1faafee15e7: Status 404 returned error can't find the container with id 3fcc18b9cb0c8197fffa39ee9724067ddc4d1499f35f5a5cdae8f1faafee15e7
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.175651 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hmdsg" podStartSLOduration=123.175621821 podStartE2EDuration="2m3.175621821s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.170898756 +0000 UTC m=+145.830739519" watchObservedRunningTime="2025-12-17 09:07:06.175621821 +0000 UTC m=+145.835462584"
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.182963 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp"]
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.190798 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-vkgt6" podStartSLOduration=123.190777593 podStartE2EDuration="2m3.190777593s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.189402366 +0000 UTC m=+145.849243139" watchObservedRunningTime="2025-12-17 09:07:06.190777593 +0000 UTC m=+145.850618346"
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.218228 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.218697 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.718684042 +0000 UTC m=+146.378524805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 17 09:07:06 crc kubenswrapper[4935]: W1217 09:07:06.220965 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36722ee1_e3e3_4533_902a_038a215f40fe.slice/crio-a8eae8be7c3c11db45432369ce495b6260b7775791b1cee1d1cf41c2515c3794 WatchSource:0}: Error finding container a8eae8be7c3c11db45432369ce495b6260b7775791b1cee1d1cf41c2515c3794: Status 404 returned error can't find the container with id a8eae8be7c3c11db45432369ce495b6260b7775791b1cee1d1cf41c2515c3794
Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.225428 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-nw6k6" podStartSLOduration=123.225409461 podStartE2EDuration="2m3.225409461s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.225237466 +0000 UTC m=+145.885078229" watchObservedRunningTime="2025-12-17 09:07:06.225409461 +0000 UTC m=+145.885250224"
Dec 17 09:07:06 crc kubenswrapper[4935]: W1217 09:07:06.230042 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod503b0e00_0b2a_4d33_a742_0dc3a2a73343.slice/crio-ef6601e9adb7352f8ab353efd52088244f0f0dfa9e651578beedb53c2bb0bbd6 WatchSource:0}: Error finding container ef6601e9adb7352f8ab353efd52088244f0f0dfa9e651578beedb53c2bb0bbd6: Status 404
returned error can't find the container with id ef6601e9adb7352f8ab353efd52088244f0f0dfa9e651578beedb53c2bb0bbd6 Dec 17 09:07:06 crc kubenswrapper[4935]: W1217 09:07:06.272796 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod888de074_122c_4990_8dc9_c1fa8b4361af.slice/crio-e09078bbf7438c4126f8ff03a7d0d26e2584e133c78000d2fc32b45b402d3a65 WatchSource:0}: Error finding container e09078bbf7438c4126f8ff03a7d0d26e2584e133c78000d2fc32b45b402d3a65: Status 404 returned error can't find the container with id e09078bbf7438c4126f8ff03a7d0d26e2584e133c78000d2fc32b45b402d3a65 Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.319138 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.319572 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.819553266 +0000 UTC m=+146.479394029 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.334853 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.342492 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:06 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:06 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:06 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.342555 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.430242 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.431716 4935 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:06.931691218 +0000 UTC m=+146.591531981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.531557 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.531990 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.031967546 +0000 UTC m=+146.691808299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.633026 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.633796 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.133771444 +0000 UTC m=+146.793612207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.735619 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.736174 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.236155538 +0000 UTC m=+146.895996301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.843640 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.844127 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.344099389 +0000 UTC m=+147.003940152 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.852161 4935 generic.go:334] "Generic (PLEG): container finished" podID="b44f7e9d-3f08-48de-a465-056daf8a4549" containerID="cb1edca177b1bbce1eb333a134dc687783b9f55990606c4a61900e0158862710" exitCode=0 Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.852422 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" event={"ID":"b44f7e9d-3f08-48de-a465-056daf8a4549","Type":"ContainerDied","Data":"cb1edca177b1bbce1eb333a134dc687783b9f55990606c4a61900e0158862710"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.864737 4935 generic.go:334] "Generic (PLEG): container finished" podID="0eba7d82-1203-49c2-bc66-7c77d2298a0d" containerID="5ec17e208eb1a62bd25208c98a72b4158882c82bbe196e3da8ce1ab1bee595fb" exitCode=0 Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.864860 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" event={"ID":"0eba7d82-1203-49c2-bc66-7c77d2298a0d","Type":"ContainerDied","Data":"5ec17e208eb1a62bd25208c98a72b4158882c82bbe196e3da8ce1ab1bee595fb"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.874987 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" event={"ID":"0407e29c-77f3-481d-8642-91af970b9ef7","Type":"ContainerStarted","Data":"88bc2643a26d46d65a54ffd28a11a5a961fd1ed6d2720a811ff1b9de1d5543d7"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.879582 
4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" podStartSLOduration=123.879561928 podStartE2EDuration="2m3.879561928s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.291488352 +0000 UTC m=+145.951329115" watchObservedRunningTime="2025-12-17 09:07:06.879561928 +0000 UTC m=+146.539402691" Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.905587 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" event={"ID":"8befec57-6330-4505-bdf6-f9fe6667a8bf","Type":"ContainerStarted","Data":"1f664fa9e08a5cdc89f1eedd25c784e1669069258838fa9b5c263d7fded52c44"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.932582 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" event={"ID":"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b","Type":"ContainerStarted","Data":"d1b30dd578873fffabbb1e3682d425b5ca210c773186e5bdae9c82c722f622a8"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.932638 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" event={"ID":"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b","Type":"ContainerStarted","Data":"76e5fbbfff371f50582561d3d6b205d2a77d47c710ccaf5e85524d0cc64efcca"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.947755 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:06 crc 
kubenswrapper[4935]: E1217 09:07:06.947942 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.44790828 +0000 UTC m=+147.107749053 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.948191 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.949442 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" event={"ID":"7b089dbd-e3e7-457c-a781-92585ac48eb9","Type":"ContainerStarted","Data":"1a0ac238455189d193d136b9f275ed2c8090710b3e95faca6b1a0f25fa8cf15d"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.949511 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" event={"ID":"7b089dbd-e3e7-457c-a781-92585ac48eb9","Type":"ContainerStarted","Data":"c2ffea6df31ebec7bbf3a3cf35a78f5359e1d3dfec82fa0d8795d3cb3c13b1b4"} Dec 17 09:07:06 crc 
kubenswrapper[4935]: I1217 09:07:06.950041 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:06 crc kubenswrapper[4935]: E1217 09:07:06.950540 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.45053083 +0000 UTC m=+147.110371593 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.968532 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tdxp5" event={"ID":"2f54f534-8232-4771-a97b-5ce4f29b8a3d","Type":"ContainerStarted","Data":"63c596c9ec98f418bf241303f32be26c4637df54c674fd4330a3bdca32a39ced"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.979830 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" event={"ID":"6a47edc9-43df-40ba-8b92-80205500df3c","Type":"ContainerStarted","Data":"0e8bbf07030e056ce7ec344f2175bdfbd7a8abfcb45f7cd1cb5a566ec9d87504"} Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.988626 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" Dec 17 09:07:06 crc kubenswrapper[4935]: I1217 09:07:06.988883 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2nltt" podStartSLOduration=122.988863925 podStartE2EDuration="2m2.988863925s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:06.986541854 +0000 UTC m=+146.646382617" watchObservedRunningTime="2025-12-17 09:07:06.988863925 +0000 UTC m=+146.648704688" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.033479 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" event={"ID":"ca8591f8-8280-436e-a261-35cf66909b26","Type":"ContainerStarted","Data":"d6f068a0221b7b9d3a1d94adfd11e84ea019f17cbf3431eab0676d516298b432"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.033526 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" event={"ID":"ca8591f8-8280-436e-a261-35cf66909b26","Type":"ContainerStarted","Data":"2a6167703e2ec7cf56d5348996e52ac2c4da5752bacb96cb83888610a62fccd4"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.050843 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.053715 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.553692734 +0000 UTC m=+147.213533497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.094854 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nmgrc" event={"ID":"830ca689-7f10-4899-b1fc-1d88feecf243","Type":"ContainerStarted","Data":"9e792264cc6d33672fb26b999778061f81eaf5cd12a55c9f593bb3527638ea09"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.138727 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-nmgrc" podStartSLOduration=6.138697836 podStartE2EDuration="6.138697836s" podCreationTimestamp="2025-12-17 09:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.133883029 +0000 UTC m=+146.793723792" watchObservedRunningTime="2025-12-17 09:07:07.138697836 +0000 UTC m=+146.798538599" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.139105 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-84n6g" podStartSLOduration=123.139100558 podStartE2EDuration="2m3.139100558s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.074840215 +0000 UTC m=+146.734680978" watchObservedRunningTime="2025-12-17 09:07:07.139100558 +0000 UTC m=+146.798941311" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 
09:07:07.153033 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" event={"ID":"fb17d1b3-11c8-491b-bf10-35bd9142ad4c","Type":"ContainerStarted","Data":"d1273a5405f1e2ffc52ab08a85d7b31d79dea10d19a3c61bc1de3f52eb8efc63"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.153359 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.153768 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.653748096 +0000 UTC m=+147.313588869 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.157353 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" event={"ID":"8823e665-73c6-4f33-a6c2-18a8a750abb9","Type":"ContainerStarted","Data":"062be1f48619d15ecf1ec01d58d7bd1472755c52bda1fd0f93f183ad641a75c2"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.157424 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" event={"ID":"8823e665-73c6-4f33-a6c2-18a8a750abb9","Type":"ContainerStarted","Data":"8ba8a5c7f0ae2fce25a93c55300f30cebd2f850c0ad1748424ed3a9fff071851"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.158122 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.168573 4935 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mmwcj container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.168658 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" 
podUID="8823e665-73c6-4f33-a6c2-18a8a750abb9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": dial tcp 10.217.0.17:8443: connect: connection refused" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.172062 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" event={"ID":"888de074-122c-4990-8dc9-c1fa8b4361af","Type":"ContainerStarted","Data":"e09078bbf7438c4126f8ff03a7d0d26e2584e133c78000d2fc32b45b402d3a65"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.173689 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" event={"ID":"60032bf5-af40-4d89-a7e3-e2e8da6382a3","Type":"ContainerStarted","Data":"bb1c28a02bd55017871486b3e2d34f607c3476258757da1524eda93c754973a1"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.187081 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" podStartSLOduration=123.187067519 podStartE2EDuration="2m3.187067519s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.18521507 +0000 UTC m=+146.845055833" watchObservedRunningTime="2025-12-17 09:07:07.187067519 +0000 UTC m=+146.846908282" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.214042 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" event={"ID":"503b0e00-0b2a-4d33-a742-0dc3a2a73343","Type":"ContainerStarted","Data":"ef6601e9adb7352f8ab353efd52088244f0f0dfa9e651578beedb53c2bb0bbd6"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.230457 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" event={"ID":"49a58e6e-ca3c-438f-b468-06d2f3ed7050","Type":"ContainerStarted","Data":"75f377b9d6df7b6815ac98fdc01ac366345bac7cd66ce40f429438ea47bd513b"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.233116 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" event={"ID":"d86831b4-8c67-47a1-bc74-8db8e8399b26","Type":"ContainerStarted","Data":"fb7a2ea48395bd9ec2807a63a3cd07d607613b1cc57da3f7bb6220f9766eb2a8"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.233177 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" event={"ID":"d86831b4-8c67-47a1-bc74-8db8e8399b26","Type":"ContainerStarted","Data":"0e44052010399e19c69165bdb62cb1425801ba994a2e5f82ed30b960dc6fe2d4"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.259071 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" event={"ID":"36722ee1-e3e3-4533-902a-038a215f40fe","Type":"ContainerStarted","Data":"a8eae8be7c3c11db45432369ce495b6260b7775791b1cee1d1cf41c2515c3794"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.259903 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.261576 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-17 09:07:07.761547303 +0000 UTC m=+147.421388066 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.304774 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" event={"ID":"84312d40-3400-410d-9ba1-952f8ffbd442","Type":"ContainerStarted","Data":"3fcc18b9cb0c8197fffa39ee9724067ddc4d1499f35f5a5cdae8f1faafee15e7"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.316017 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" event={"ID":"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d","Type":"ContainerStarted","Data":"dd6d35b57fd819c089ae1a9d5a682742fbcceda4b035cc725f929b85366cd357"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.322599 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" event={"ID":"50ad2e0b-8528-481e-af4f-f7343d62878a","Type":"ContainerStarted","Data":"2fd08308dc94565d45b0a3f522835b5fd094a752b6cdfd474f5520668f528589"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.322658 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" event={"ID":"50ad2e0b-8528-481e-af4f-f7343d62878a","Type":"ContainerStarted","Data":"167a30bb6f3eb60f60021d8e91a6e60b07857574abe347c431707cde369f9b0f"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 
09:07:07.323688 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.324949 4935 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-k6mmw container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.324984 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" podUID="50ad2e0b-8528-481e-af4f-f7343d62878a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.336181 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" podStartSLOduration=123.33615702 podStartE2EDuration="2m3.33615702s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.335412681 +0000 UTC m=+146.995253444" watchObservedRunningTime="2025-12-17 09:07:07.33615702 +0000 UTC m=+146.995997803" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.343148 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:07 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:07 crc kubenswrapper[4935]: [+]process-running ok Dec 17 
09:07:07 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.343224 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.354383 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" event={"ID":"3207f41f-7cf8-4e0f-80d2-639237ce8b3e","Type":"ContainerStarted","Data":"75a3cd6c1367fea2ea9dd1055fc185da7d9d79b6ccf5266cada87744f6e4b9b7"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.354429 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" event={"ID":"3207f41f-7cf8-4e0f-80d2-639237ce8b3e","Type":"ContainerStarted","Data":"9727935be0edca14690def20a28fb4f50258c85fda4761e15cd531d19424280f"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.366336 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.366879 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:07.866863714 +0000 UTC m=+147.526704477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.375774 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" podStartSLOduration=123.37575035 podStartE2EDuration="2m3.37575035s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.373666024 +0000 UTC m=+147.033506787" watchObservedRunningTime="2025-12-17 09:07:07.37575035 +0000 UTC m=+147.035591113" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.402566 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" event={"ID":"f337d441-0527-46d0-98f4-a9323a682482","Type":"ContainerStarted","Data":"96bdc3667af8f814f902f782f581db0bb51e1aedc6cda40ceffc10a9b96ff1fc"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.426578 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.426796 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" event={"ID":"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e","Type":"ContainerStarted","Data":"3b85fdf3cfd7ab926d9326949d4ed07be9eac62eb6f2804c55c717c3207d6414"} Dec 17 09:07:07 crc 
kubenswrapper[4935]: I1217 09:07:07.426868 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" event={"ID":"89340f7a-cc56-44ef-8fbf-0e4efd9ce27e","Type":"ContainerStarted","Data":"926815ffbef8feb894efde84385cf53f2a306021e2c9496694b18c009c80c17b"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.456649 4935 generic.go:334] "Generic (PLEG): container finished" podID="4334562a-5ab5-4601-932e-4a765f2d11a8" containerID="2872a1d5b5e242a87b9d4c10c43b9b2abeeec372bc77610a0d7c91d9286e63d4" exitCode=0 Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.456739 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" event={"ID":"4334562a-5ab5-4601-932e-4a765f2d11a8","Type":"ContainerDied","Data":"2872a1d5b5e242a87b9d4c10c43b9b2abeeec372bc77610a0d7c91d9286e63d4"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.456773 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" event={"ID":"4334562a-5ab5-4601-932e-4a765f2d11a8","Type":"ContainerStarted","Data":"3e6d75e73871ca5e9459862d11b49952d1ae36ede73dedee73da33da8a16ac5c"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.474879 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.476400 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-17 09:07:07.976373917 +0000 UTC m=+147.636214680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.578002 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" event={"ID":"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524","Type":"ContainerStarted","Data":"d09d07a9758bef9937f7eb388df32ce29a4584d898d1ebdd13ccb587c7929f84"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.580797 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.582539 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:08.08252431 +0000 UTC m=+147.742365073 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.594233 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-qnwvp" podStartSLOduration=123.59420759 podStartE2EDuration="2m3.59420759s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.57950766 +0000 UTC m=+147.239348423" watchObservedRunningTime="2025-12-17 09:07:07.59420759 +0000 UTC m=+147.254048353" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.651714 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" event={"ID":"e69422fb-c984-49c7-98db-a055b29fa457","Type":"ContainerStarted","Data":"4b5aaa9872187552cf77ca73ef9ab93e7d7ee5c4f483c6020858947dd5516467"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.652287 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" event={"ID":"e69422fb-c984-49c7-98db-a055b29fa457","Type":"ContainerStarted","Data":"ccdbe1f3d0c5dbe2016d4df844523789424670e98644bc6199cd3e1385f8cd6d"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.683742 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.684458 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:08.184436801 +0000 UTC m=+147.844277564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.700419 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" event={"ID":"ac56f58a-57ad-4d33-931d-2c00504e09fa","Type":"ContainerStarted","Data":"a25ff32776544291f5be2fcfdde92e7e27d74467674de87e413357e63960837a"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.741258 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" event={"ID":"8a03e09d-6bb6-4241-b1f7-24f864b05640","Type":"ContainerStarted","Data":"ec2ba3c3c7987cda72393c4441d35e51267d5aa703847640994aa5c8b173bb10"} Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.761573 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tfngw" Dec 17 09:07:07 crc kubenswrapper[4935]: 
I1217 09:07:07.772415 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" podStartSLOduration=123.772388512 podStartE2EDuration="2m3.772388512s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.772017913 +0000 UTC m=+147.431858676" watchObservedRunningTime="2025-12-17 09:07:07.772388512 +0000 UTC m=+147.432229275" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.772940 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" podStartSLOduration=123.772933186 podStartE2EDuration="2m3.772933186s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:07.657701773 +0000 UTC m=+147.317542536" watchObservedRunningTime="2025-12-17 09:07:07.772933186 +0000 UTC m=+147.432773949" Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.789796 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.790358 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-17 09:07:08.290339348 +0000 UTC m=+147.950180121 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.892759 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.894497 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:08.394480048 +0000 UTC m=+148.054320811 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:07 crc kubenswrapper[4935]: I1217 09:07:07.996802 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:07 crc kubenswrapper[4935]: E1217 09:07:07.997197 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:08.49717967 +0000 UTC m=+148.157020433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.102911 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.103390 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:08.603368734 +0000 UTC m=+148.263209497 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.205189 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.205635 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:08.705623255 +0000 UTC m=+148.365464018 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.308046 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.309122 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:08.809091577 +0000 UTC m=+148.468932340 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.348575 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:08 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:08 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:08 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.348656 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.410342 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.410897 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-17 09:07:08.910875535 +0000 UTC m=+148.570716458 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.511865 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.512312 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.012290003 +0000 UTC m=+148.672130766 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.613476 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.614118 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.114097341 +0000 UTC m=+148.773938104 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.715216 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.715603 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.215578111 +0000 UTC m=+148.875418874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.772844 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" event={"ID":"503b0e00-0b2a-4d33-a742-0dc3a2a73343","Type":"ContainerStarted","Data":"6b63300ce508618ddb50bdf2616536eb60032684f88e91216f4fca220cc5c633"} Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.822433 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.822861 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.322842304 +0000 UTC m=+148.982683067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.823018 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" event={"ID":"d86831b4-8c67-47a1-bc74-8db8e8399b26","Type":"ContainerStarted","Data":"e9dd12888722bf8fab0ac817eb0fd988021458f0c795e8a33c3b8c8fd50f8ebe"} Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.824097 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.849238 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d7j8g" event={"ID":"84312d40-3400-410d-9ba1-952f8ffbd442","Type":"ContainerStarted","Data":"4b4d58ed3c3281143cd69ec68c9decab409bb8b9a7a4a5bb5953a430613bb6cf"} Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.907141 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" podStartSLOduration=124.907108937 podStartE2EDuration="2m4.907108937s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:08.890178018 +0000 UTC m=+148.550018781" watchObservedRunningTime="2025-12-17 09:07:08.907108937 +0000 UTC m=+148.566949700" Dec 17 09:07:08 
crc kubenswrapper[4935]: I1217 09:07:08.912109 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" event={"ID":"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d","Type":"ContainerStarted","Data":"1fc287aeb97905df13f432dd9e4c2004f5ee67266279312cb14ad7b50649524c"} Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.925480 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:08 crc kubenswrapper[4935]: E1217 09:07:08.943558 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.443532512 +0000 UTC m=+149.103373275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.966647 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" event={"ID":"4334562a-5ab5-4601-932e-4a765f2d11a8","Type":"ContainerStarted","Data":"e1b8fa7855b6bf3cf13c344811d7353448bbd10e088941c770e074582ea942fa"} Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.966776 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.967135 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" podStartSLOduration=124.967116288 podStartE2EDuration="2m4.967116288s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:08.965057914 +0000 UTC m=+148.624898697" watchObservedRunningTime="2025-12-17 09:07:08.967116288 +0000 UTC m=+148.626957051" Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.978057 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" event={"ID":"b44f7e9d-3f08-48de-a465-056daf8a4549","Type":"ContainerStarted","Data":"53d4f3e95e16d45dce543b8f8e60f94a87501e2a82a3b1168cb6ef1c202bcfd4"} Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.990386 4935 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" podStartSLOduration=125.990366034 podStartE2EDuration="2m5.990366034s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:08.989829249 +0000 UTC m=+148.649670012" watchObservedRunningTime="2025-12-17 09:07:08.990366034 +0000 UTC m=+148.650206787" Dec 17 09:07:08 crc kubenswrapper[4935]: I1217 09:07:08.991384 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" event={"ID":"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524","Type":"ContainerStarted","Data":"9128e2ca047e1cbe724ab69b1ff38ad86d8c64e89a1d459ad6cb654321db2371"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.001969 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" event={"ID":"888de074-122c-4990-8dc9-c1fa8b4361af","Type":"ContainerStarted","Data":"a3ea389434c4df5974769e8062edfaded8de5bd4106ece36151d254f7a6e6424"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.002031 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" event={"ID":"888de074-122c-4990-8dc9-c1fa8b4361af","Type":"ContainerStarted","Data":"660e5ebbd6693feda76d55e56083ca5e4ddd0918b9527fd7eccf2b1ac508e938"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.030667 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.033476 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.533461517 +0000 UTC m=+149.193302270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.049546 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-nmgrc" event={"ID":"830ca689-7f10-4899-b1fc-1d88feecf243","Type":"ContainerStarted","Data":"14ba9dd64b2b60965e4df4d34c57bd829200a17b1a06d8723c895c75738bd65d"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.053141 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nc7cp" podStartSLOduration=125.053118778 podStartE2EDuration="2m5.053118778s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.049085911 +0000 UTC m=+148.708926674" watchObservedRunningTime="2025-12-17 09:07:09.053118778 +0000 UTC m=+148.712959541" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.067635 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" 
event={"ID":"0407e29c-77f3-481d-8642-91af970b9ef7","Type":"ContainerStarted","Data":"8e0501094e4775433ae325df4913d4503740cb4d3a95aad1bb3116438d6c666d"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.076777 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" event={"ID":"36722ee1-e3e3-4533-902a-038a215f40fe","Type":"ContainerStarted","Data":"d29dbf652368a90c505171e27a01fc36758f568a1e1339b31c5683a2d11ba5f0"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.083668 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" event={"ID":"8befec57-6330-4505-bdf6-f9fe6667a8bf","Type":"ContainerStarted","Data":"c49e44abf521d5a2a6d4b02ab86c263350c91a2259505e91cf5c802f8249f31c"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.092308 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" event={"ID":"0eba7d82-1203-49c2-bc66-7c77d2298a0d","Type":"ContainerStarted","Data":"de8090d023881088f7d5f79dd9a5099400459ea4f6be2691c13bed9590a3297c"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.111452 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" event={"ID":"49a58e6e-ca3c-438f-b468-06d2f3ed7050","Type":"ContainerStarted","Data":"d33e57125a5d0e9ca50f0039aa90e9dcb331a378807ebe13d391b1b53196c409"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.114328 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8nsh2" podStartSLOduration=125.114313869 podStartE2EDuration="2m5.114313869s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 
09:07:09.113682573 +0000 UTC m=+148.773523336" watchObservedRunningTime="2025-12-17 09:07:09.114313869 +0000 UTC m=+148.774154632" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.138200 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.140171 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.640155844 +0000 UTC m=+149.299996607 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.172453 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-jwnkk" podStartSLOduration=125.17243041 podStartE2EDuration="2m5.17243041s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.170091157 +0000 UTC m=+148.829931920" watchObservedRunningTime="2025-12-17 09:07:09.17243041 +0000 UTC m=+148.832271163" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 
09:07:09.175579 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" event={"ID":"6a47edc9-43df-40ba-8b92-80205500df3c","Type":"ContainerStarted","Data":"31c44d896aff342eef536ccd2d7a7f2e8243b2cd31e525f52a54b2c56a0adfdc"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.207545 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" event={"ID":"fb17d1b3-11c8-491b-bf10-35bd9142ad4c","Type":"ContainerStarted","Data":"ac326fb764f9e06ad8285fcc0bc5efd62d1be62d5cbe68bd785bc1029bd67819"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.242793 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.244623 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.744602732 +0000 UTC m=+149.404443495 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.268135 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wmgg7" podStartSLOduration=125.268112115 podStartE2EDuration="2m5.268112115s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.217071253 +0000 UTC m=+148.876912026" watchObservedRunningTime="2025-12-17 09:07:09.268112115 +0000 UTC m=+148.927952878" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.291096 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" event={"ID":"cbf26e5b-1ce7-4dfd-82ce-ea56cf9f1a7b","Type":"ContainerStarted","Data":"da3d6f3389d7a822e70c6b99f8d8403edf1fb9a8186ab9ab302c9efa430bc466"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.318885 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bcz66" podStartSLOduration=126.31884932 podStartE2EDuration="2m6.31884932s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.317454903 +0000 UTC m=+148.977295666" 
watchObservedRunningTime="2025-12-17 09:07:09.31884932 +0000 UTC m=+148.978690103" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.319138 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" podStartSLOduration=125.319129148 podStartE2EDuration="2m5.319129148s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.281094799 +0000 UTC m=+148.940935562" watchObservedRunningTime="2025-12-17 09:07:09.319129148 +0000 UTC m=+148.978969911" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.344625 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.346186 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.846164964 +0000 UTC m=+149.506005727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.347211 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:09 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:09 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:09 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.347265 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.351460 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-k7vbl" event={"ID":"8a03e09d-6bb6-4241-b1f7-24f864b05640","Type":"ContainerStarted","Data":"fc588a2cb51141ccc4a25ed7e6444a1508b0e3f04a29ccb9cea8628f26b11e3e"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.390035 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tdxp5" event={"ID":"2f54f534-8232-4771-a97b-5ce4f29b8a3d","Type":"ContainerStarted","Data":"fcf63e408545e2c63c1d263dad63e0ca752c42584e0cd57b0c80126529b2e241"} Dec 17 09:07:09 crc 
kubenswrapper[4935]: I1217 09:07:09.390914 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.400887 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" event={"ID":"60032bf5-af40-4d89-a7e3-e2e8da6382a3","Type":"ContainerStarted","Data":"eed1cf006e81f4d1393aedf988cc075b6f7c13e80cbab04480af041094c2aa64"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.401906 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.403409 4935 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5j4xq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.403460 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.414679 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" event={"ID":"3207f41f-7cf8-4e0f-80d2-639237ce8b3e","Type":"ContainerStarted","Data":"b005b210a78d96ffc6f4573e66378912b508346b897e310becf46b4350dd607c"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.417643 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" 
event={"ID":"ac56f58a-57ad-4d33-931d-2c00504e09fa","Type":"ContainerStarted","Data":"3fc83a81290dcd36c2b77a63d17b40e00c2664f16873b093ef008f828306fb48"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.430979 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" event={"ID":"e69422fb-c984-49c7-98db-a055b29fa457","Type":"ContainerStarted","Data":"c0138712284bf9a988bf49007a928c461c8850a960c16f5984f1eeadbd30a00a"} Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.432553 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hq8r6" podStartSLOduration=126.432519983 podStartE2EDuration="2m6.432519983s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.392933494 +0000 UTC m=+149.052774277" watchObservedRunningTime="2025-12-17 09:07:09.432519983 +0000 UTC m=+149.092360746" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.433938 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tdxp5" podStartSLOduration=8.433932181 podStartE2EDuration="8.433932181s" podCreationTimestamp="2025-12-17 09:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.433671093 +0000 UTC m=+149.093511866" watchObservedRunningTime="2025-12-17 09:07:09.433932181 +0000 UTC m=+149.093772944" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.445920 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: 
\"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.446527 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.446813 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-k6mmw" Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.446908 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:09.946893274 +0000 UTC m=+149.606734037 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.463456 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-bc5v9" podStartSLOduration=126.463428603 podStartE2EDuration="2m6.463428603s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.462718803 +0000 UTC m=+149.122559576" watchObservedRunningTime="2025-12-17 09:07:09.463428603 +0000 UTC m=+149.123269366" Dec 17 09:07:09 crc 
kubenswrapper[4935]: I1217 09:07:09.540952 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-z5h6w" podStartSLOduration=125.540928956 podStartE2EDuration="2m5.540928956s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.537708441 +0000 UTC m=+149.197549214" watchObservedRunningTime="2025-12-17 09:07:09.540928956 +0000 UTC m=+149.200769709" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.549027 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.549567 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.049540914 +0000 UTC m=+149.709381677 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.550168 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.550678 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.050660004 +0000 UTC m=+149.710500767 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.635150 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-8gmgm" podStartSLOduration=125.635107212 podStartE2EDuration="2m5.635107212s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.577320841 +0000 UTC m=+149.237161604" watchObservedRunningTime="2025-12-17 09:07:09.635107212 +0000 UTC m=+149.294947975" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.653239 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.653720 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.153698356 +0000 UTC m=+149.813539109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.754582 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.755214 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.255201915 +0000 UTC m=+149.915042678 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.813115 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" podStartSLOduration=125.81309721 podStartE2EDuration="2m5.81309721s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.705893799 +0000 UTC m=+149.365734562" watchObservedRunningTime="2025-12-17 09:07:09.81309721 +0000 UTC m=+149.472937973" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.856176 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.856636 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.356620124 +0000 UTC m=+150.016460887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.916822 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-mqg4s" podStartSLOduration=126.916800488 podStartE2EDuration="2m6.916800488s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:09.814071226 +0000 UTC m=+149.473911989" watchObservedRunningTime="2025-12-17 09:07:09.916800488 +0000 UTC m=+149.576641251" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.918726 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6b6bk"] Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.919734 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.927128 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.954317 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6b6bk"] Dec 17 09:07:09 crc kubenswrapper[4935]: I1217 09:07:09.957690 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:09 crc kubenswrapper[4935]: E1217 09:07:09.958056 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.458041712 +0000 UTC m=+150.117882475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.060091 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.060334 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.560300572 +0000 UTC m=+150.220141335 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.060420 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.060456 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n79k6\" (UniqueName: \"kubernetes.io/projected/157182ad-4c05-4142-9659-4d1309dceec9-kube-api-access-n79k6\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.060638 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-utilities\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.060713 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-catalog-content\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.060876 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.560867977 +0000 UTC m=+150.220708740 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.134659 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wjwnt"] Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.135832 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.154922 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.161435 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.161623 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.661590956 +0000 UTC m=+150.321431719 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.162187 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-utilities\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.162355 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.162519 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.162663 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-catalog-content\") pod \"certified-operators-6b6bk\" (UID: 
\"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.162795 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.162934 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.163046 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n79k6\" (UniqueName: \"kubernetes.io/projected/157182ad-4c05-4142-9659-4d1309dceec9-kube-api-access-n79k6\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.163172 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.163183 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.164148 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-utilities\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.164638 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.664615206 +0000 UTC m=+150.324455969 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.164800 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-catalog-content\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.172231 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.172235 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.173312 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod 
\"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.180194 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.218620 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n79k6\" (UniqueName: \"kubernetes.io/projected/157182ad-4c05-4142-9659-4d1309dceec9-kube-api-access-n79k6\") pod \"certified-operators-6b6bk\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.222826 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wjwnt"] Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.242242 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.265248 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.265465 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.765427378 +0000 UTC m=+150.425268141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.266087 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-utilities\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.266150 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-catalog-content\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.266204 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.266307 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjv6\" (UniqueName: 
\"kubernetes.io/projected/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-kube-api-access-bjjv6\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.266849 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.766807175 +0000 UTC m=+150.426647948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.316957 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7zz8h"] Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.322168 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.340462 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:10 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:10 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:10 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.340528 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.354046 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zz8h"] Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.367858 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.368226 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-utilities\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.368266 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-catalog-content\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.368351 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjjv6\" (UniqueName: \"kubernetes.io/projected/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-kube-api-access-bjjv6\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.368712 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.868698076 +0000 UTC m=+150.528538839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.369101 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-utilities\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.369336 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-catalog-content\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.449640 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.451478 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjjv6\" (UniqueName: \"kubernetes.io/projected/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-kube-api-access-bjjv6\") pod \"community-operators-wjwnt\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.459681 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.468593 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.470161 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-utilities\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.470247 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.470304 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-catalog-content\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.470321 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22k4b\" (UniqueName: \"kubernetes.io/projected/78732594-f947-416e-a67d-2c1fd2ae310f-kube-api-access-22k4b\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 
09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.470656 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:10.970644468 +0000 UTC m=+150.630485231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.540049 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vcgs6"] Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.546745 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.557714 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcgs6"] Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.572002 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.572289 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-utilities\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.572387 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22k4b\" (UniqueName: \"kubernetes.io/projected/78732594-f947-416e-a67d-2c1fd2ae310f-kube-api-access-22k4b\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.572415 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-catalog-content\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.572902 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-catalog-content\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.572984 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:11.07296445 +0000 UTC m=+150.732805213 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.573203 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-utilities\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.585259 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gx689" event={"ID":"05f18f0f-eb23-4bc8-ab78-9ff1c84d825d","Type":"ContainerStarted","Data":"423df178a34aeb37d5ab3c141aad31a4fc67be1e0843e09fc32b5594c6ca08a5"} Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.603011 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22k4b\" (UniqueName: 
\"kubernetes.io/projected/78732594-f947-416e-a67d-2c1fd2ae310f-kube-api-access-22k4b\") pod \"certified-operators-7zz8h\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.628580 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tdxp5" event={"ID":"2f54f534-8232-4771-a97b-5ce4f29b8a3d","Type":"ContainerStarted","Data":"45f821a98c2290754570d8279e8ba30b02d55c4313dc480efc1d483204af01f5"} Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.650917 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" event={"ID":"b44f7e9d-3f08-48de-a465-056daf8a4549","Type":"ContainerStarted","Data":"c34376c1c0651199fa8275ce426ba91a02aa0f36ff6183f87a310de5371ca5c1"} Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.653873 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.682006 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-catalog-content\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.682102 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6jjr\" (UniqueName: \"kubernetes.io/projected/1c7a0058-87b0-440a-b80a-ebb69f4d9370-kube-api-access-q6jjr\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.682180 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.682225 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-utilities\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.683807 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:11.183785387 +0000 UTC m=+150.843626150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.713785 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" event={"ID":"503b0e00-0b2a-4d33-a742-0dc3a2a73343","Type":"ContainerStarted","Data":"c148162700e02d29c0fbb448f2c8745a994cf69bf14c0faccf8390929f7d1ff1"} Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.721462 4935 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-5j4xq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.721567 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.724571 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" podStartSLOduration=127.724559897 podStartE2EDuration="2m7.724559897s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 
09:07:10.722558925 +0000 UTC m=+150.382399688" watchObservedRunningTime="2025-12-17 09:07:10.724559897 +0000 UTC m=+150.384400660" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.783242 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.783577 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-catalog-content\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.783626 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6jjr\" (UniqueName: \"kubernetes.io/projected/1c7a0058-87b0-440a-b80a-ebb69f4d9370-kube-api-access-q6jjr\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.783731 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-utilities\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.785787 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2025-12-17 09:07:11.28575884 +0000 UTC m=+150.945599743 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.786506 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-catalog-content\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.787319 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-utilities\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.792238 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-bvl78" podStartSLOduration=126.792216531 podStartE2EDuration="2m6.792216531s" podCreationTimestamp="2025-12-17 09:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:10.79181989 +0000 UTC m=+150.451660653" watchObservedRunningTime="2025-12-17 09:07:10.792216531 +0000 UTC m=+150.452057294" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.856940 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6jjr\" (UniqueName: \"kubernetes.io/projected/1c7a0058-87b0-440a-b80a-ebb69f4d9370-kube-api-access-q6jjr\") pod \"community-operators-vcgs6\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.891975 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:10 crc kubenswrapper[4935]: E1217 09:07:10.908417 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:11.408399271 +0000 UTC m=+151.068240034 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:10 crc kubenswrapper[4935]: I1217 09:07:10.949310 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.001882 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:11 crc kubenswrapper[4935]: E1217 09:07:11.002698 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:11.502678839 +0000 UTC m=+151.162519602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.109185 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:11 crc kubenswrapper[4935]: E1217 09:07:11.109618 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:11.609603893 +0000 UTC m=+151.269444656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.212799 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:11 crc kubenswrapper[4935]: E1217 09:07:11.213301 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-17 09:07:11.713246649 +0000 UTC m=+151.373087422 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.293069 4935 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.317156 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:11 crc kubenswrapper[4935]: E1217 09:07:11.317562 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-17 09:07:11.817548884 +0000 UTC m=+151.477389637 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-g8v79" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.342183 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:11 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:11 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:11 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.342512 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.364418 4935 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-17T09:07:11.293480917Z","Handler":null,"Name":""} Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.396715 4935 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.396767 4935 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: 
kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.420807 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.440508 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6b6bk"] Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.441002 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.523364 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.540167 4935 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.540225 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.675189 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.676699 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.688398 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.688487 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.689011 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.730843 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28d2313-0c54-4d14-b951-9b74d5f49819-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:11 
crc kubenswrapper[4935]: I1217 09:07:11.730916 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d2313-0c54-4d14-b951-9b74d5f49819-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.741616 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d4a937c91a922612781c07d4f74390817f0916c6fa58fb4f856f3a2e3e523c69"} Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.742083 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9e2dcdeb601a7676faa5941b825456a40b3dc6937721d3da40c31a3a4a0aa47a"} Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.743369 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.753254 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" event={"ID":"0407e29c-77f3-481d-8642-91af970b9ef7","Type":"ContainerStarted","Data":"00f7a83ef5cc682415c199399b9a35a1a18703f33a0c1936f178f36767aba2dc"} Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.768540 4935 generic.go:334] "Generic (PLEG): container finished" podID="78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" containerID="9128e2ca047e1cbe724ab69b1ff38ad86d8c64e89a1d459ad6cb654321db2371" exitCode=0 Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.776326 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" event={"ID":"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524","Type":"ContainerDied","Data":"9128e2ca047e1cbe724ab69b1ff38ad86d8c64e89a1d459ad6cb654321db2371"} Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.834555 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28d2313-0c54-4d14-b951-9b74d5f49819-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.834708 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d2313-0c54-4d14-b951-9b74d5f49819-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.834982 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28d2313-0c54-4d14-b951-9b74d5f49819-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.854577 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-g8v79\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.858310 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-6b6bk" event={"ID":"157182ad-4c05-4142-9659-4d1309dceec9","Type":"ContainerStarted","Data":"1cebe785aa00c37c774a0c2c2eb46df23217d362171dcee9c567ddcca40208f2"} Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.862422 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7zz8h"] Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.863199 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d2313-0c54-4d14-b951-9b74d5f49819-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.871758 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:07:11 crc kubenswrapper[4935]: I1217 09:07:11.976832 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vcgs6"] Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.021659 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.050323 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wjwnt"] Dec 17 09:07:12 crc kubenswrapper[4935]: W1217 09:07:12.080594 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-b024d164923887c95ebdb0a05f74e3f6bc7e75d595b78ec6c6c1ef69410a5eed WatchSource:0}: Error finding container b024d164923887c95ebdb0a05f74e3f6bc7e75d595b78ec6c6c1ef69410a5eed: Status 404 returned error can't find the container with id b024d164923887c95ebdb0a05f74e3f6bc7e75d595b78ec6c6c1ef69410a5eed Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.122640 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67fts"] Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.123912 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.124867 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.129697 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67fts"] Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.132369 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.256258 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwrtw\" (UniqueName: \"kubernetes.io/projected/b246facd-9d67-4a8c-9b5f-e160ae46c462-kube-api-access-lwrtw\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.256702 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-catalog-content\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.256795 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-utilities\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.340135 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Dec 17 09:07:12 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:12 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:12 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.340191 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.357727 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-utilities\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.357801 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwrtw\" (UniqueName: \"kubernetes.io/projected/b246facd-9d67-4a8c-9b5f-e160ae46c462-kube-api-access-lwrtw\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.357839 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-catalog-content\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.358304 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-catalog-content\") pod \"redhat-marketplace-67fts\" 
(UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.358753 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-utilities\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.408870 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwrtw\" (UniqueName: \"kubernetes.io/projected/b246facd-9d67-4a8c-9b5f-e160ae46c462-kube-api-access-lwrtw\") pod \"redhat-marketplace-67fts\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.451441 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.499398 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kchlg"] Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.501487 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.516592 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kchlg"] Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.560240 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-utilities\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.560382 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnrj\" (UniqueName: \"kubernetes.io/projected/72303ff8-4dec-4270-b0e6-6414b0cf0c63-kube-api-access-bxnrj\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.560416 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-catalog-content\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.583158 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g8v79"] Dec 17 09:07:12 crc kubenswrapper[4935]: W1217 09:07:12.587010 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aea1606_ff6f_4325_9f92_c83e2c5079c0.slice/crio-30cb1279a3fee019e342ba7482e208590983ff2d677c74b287d186f6d76ff0cb WatchSource:0}: 
Error finding container 30cb1279a3fee019e342ba7482e208590983ff2d677c74b287d186f6d76ff0cb: Status 404 returned error can't find the container with id 30cb1279a3fee019e342ba7482e208590983ff2d677c74b287d186f6d76ff0cb Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.651857 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5dtrh" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.661714 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-utilities\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.661792 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnrj\" (UniqueName: \"kubernetes.io/projected/72303ff8-4dec-4270-b0e6-6414b0cf0c63-kube-api-access-bxnrj\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.661828 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-catalog-content\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.663116 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-utilities\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 
09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.663129 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-catalog-content\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.685598 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnrj\" (UniqueName: \"kubernetes.io/projected/72303ff8-4dec-4270-b0e6-6414b0cf0c63-kube-api-access-bxnrj\") pod \"redhat-marketplace-kchlg\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") " pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.735130 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67fts"] Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.837579 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kchlg" Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.866600 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" event={"ID":"0407e29c-77f3-481d-8642-91af970b9ef7","Type":"ContainerStarted","Data":"e40ed4895bb9a9b3b5ff0b60e27f1f30496dc921f7843f0def125c184b5e21d0"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.868304 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgs6" event={"ID":"1c7a0058-87b0-440a-b80a-ebb69f4d9370","Type":"ContainerStarted","Data":"23ce43dda980e11cac6e1e7111bd5b72d85fcdb6ab8712ef21df24bae36c766b"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.869736 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b024d164923887c95ebdb0a05f74e3f6bc7e75d595b78ec6c6c1ef69410a5eed"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.872323 4935 generic.go:334] "Generic (PLEG): container finished" podID="78732594-f947-416e-a67d-2c1fd2ae310f" containerID="1df4e7bd26c2e457f798727ff3b7844e40e5021e168192d88439ba35d9dac9d5" exitCode=0 Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.872427 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zz8h" event={"ID":"78732594-f947-416e-a67d-2c1fd2ae310f","Type":"ContainerDied","Data":"1df4e7bd26c2e457f798727ff3b7844e40e5021e168192d88439ba35d9dac9d5"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.872462 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zz8h" event={"ID":"78732594-f947-416e-a67d-2c1fd2ae310f","Type":"ContainerStarted","Data":"f78666ae4b64ece1b3fa87db3b7010876bb5d2809d07032705f4c12b4fa0ed0c"} Dec 17 09:07:12 crc 
kubenswrapper[4935]: I1217 09:07:12.874639 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.875945 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"d2bbdc731c6d322fe02c6352ce4cdef255d7b3fef583cf4cf3564dc3a0824fc9"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.876001 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f0d6a50259cee63cc5223daa7449b4a2f4617a3f4a2e06e3badd559c860f19c"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.878229 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" event={"ID":"2aea1606-ff6f-4325-9f92-c83e2c5079c0","Type":"ContainerStarted","Data":"30cb1279a3fee019e342ba7482e208590983ff2d677c74b287d186f6d76ff0cb"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.880067 4935 generic.go:334] "Generic (PLEG): container finished" podID="157182ad-4c05-4142-9659-4d1309dceec9" containerID="248b15934de31bbfd86db5d91efd0f8cfe0d1b3d0ae2a4839476ed0cc45c44aa" exitCode=0 Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.880152 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b6bk" event={"ID":"157182ad-4c05-4142-9659-4d1309dceec9","Type":"ContainerDied","Data":"248b15934de31bbfd86db5d91efd0f8cfe0d1b3d0ae2a4839476ed0cc45c44aa"} Dec 17 09:07:12 crc kubenswrapper[4935]: I1217 09:07:12.884209 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjwnt" 
event={"ID":"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7","Type":"ContainerStarted","Data":"fa1612f497e62c5abe138ffd12f070806981fa57910517eb62f7111784de6c48"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.094538 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hl2vl"] Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.095764 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.101689 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.110552 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hl2vl"] Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.130648 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.136965 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.166239 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.170143 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8dss\" (UniqueName: \"kubernetes.io/projected/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-kube-api-access-v8dss\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 
09:07:13.170354 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-catalog-content\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.170481 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-utilities\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: W1217 09:07:13.173931 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb246facd_9d67_4a8c_9b5f_e160ae46c462.slice/crio-e9cc185d4809f7f5b42737ba2cc8b38a451eba9e8220174d3f46e66940f40959 WatchSource:0}: Error finding container e9cc185d4809f7f5b42737ba2cc8b38a451eba9e8220174d3f46e66940f40959: Status 404 returned error can't find the container with id e9cc185d4809f7f5b42737ba2cc8b38a451eba9e8220174d3f46e66940f40959 Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.178071 4935 patch_prober.go:28] interesting pod/downloads-7954f5f757-29mfs container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.178112 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-29mfs" podUID="618adccc-479a-43b8-a44f-eb62ce26108a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" 
Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.178439 4935 patch_prober.go:28] interesting pod/downloads-7954f5f757-29mfs container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" start-of-body= Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.178462 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-29mfs" podUID="618adccc-479a-43b8-a44f-eb62ce26108a" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.13:8080/\": dial tcp 10.217.0.13:8080: connect: connection refused" Dec 17 09:07:13 crc kubenswrapper[4935]: W1217 09:07:13.194686 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb28d2313_0c54_4d14_b951_9b74d5f49819.slice/crio-bceed74e5c3ddeb5d8e1cb2a30f292f0900698d845438948ee1e748d60809d50 WatchSource:0}: Error finding container bceed74e5c3ddeb5d8e1cb2a30f292f0900698d845438948ee1e748d60809d50: Status 404 returned error can't find the container with id bceed74e5c3ddeb5d8e1cb2a30f292f0900698d845438948ee1e748d60809d50 Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.277197 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq47v\" (UniqueName: \"kubernetes.io/projected/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-kube-api-access-qq47v\") pod \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.277440 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-config-volume\") pod \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.277500 4935 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-secret-volume\") pod \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\" (UID: \"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524\") " Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.277842 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8dss\" (UniqueName: \"kubernetes.io/projected/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-kube-api-access-v8dss\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.278053 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-catalog-content\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.278144 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-config-volume" (OuterVolumeSpecName: "config-volume") pod "78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" (UID: "78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.278259 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-utilities\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.279249 4935 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-config-volume\") on node \"crc\" DevicePath \"\"" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.279690 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-utilities\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.279789 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-catalog-content\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.284938 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-kube-api-access-qq47v" (OuterVolumeSpecName: "kube-api-access-qq47v") pod "78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" (UID: "78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524"). InnerVolumeSpecName "kube-api-access-qq47v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.285675 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kchlg"] Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.286302 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" (UID: "78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.298729 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8dss\" (UniqueName: \"kubernetes.io/projected/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-kube-api-access-v8dss\") pod \"redhat-operators-hl2vl\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.339063 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:13 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:13 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:13 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.339180 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.381167 4935 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-qq47v\" (UniqueName: \"kubernetes.io/projected/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-kube-api-access-qq47v\") on node \"crc\" DevicePath \"\"" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.381248 4935 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.413314 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.495087 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8h5r5"] Dec 17 09:07:13 crc kubenswrapper[4935]: E1217 09:07:13.495347 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" containerName="collect-profiles" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.495361 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" containerName="collect-profiles" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.495509 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" containerName="collect-profiles" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.496325 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.506152 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8h5r5"] Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.553457 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.554252 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.559818 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.559854 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.562734 4935 patch_prober.go:28] interesting pod/console-f9d7485db-nw6k6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.562778 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-nw6k6" podUID="a074b884-bf31-47dc-9257-41a7d4dda13e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.568110 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.568214 4935 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.568229 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.578781 4935 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ckj77 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]log ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]etcd ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/generic-apiserver-start-informers ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/max-in-flight-filter ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 17 09:07:13 crc kubenswrapper[4935]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/project.openshift.io-projectcache ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/openshift.io-startinformers ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 17 09:07:13 crc kubenswrapper[4935]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 17 09:07:13 crc kubenswrapper[4935]: livez check failed Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.578926 4935 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" podUID="b44f7e9d-3f08-48de-a465-056daf8a4549" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.586622 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch4ld\" (UniqueName: \"kubernetes.io/projected/cb608d96-a065-48d5-b74f-dc166ba31c08-kube-api-access-ch4ld\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.586805 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-utilities\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.586855 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-catalog-content\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.688621 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch4ld\" (UniqueName: \"kubernetes.io/projected/cb608d96-a065-48d5-b74f-dc166ba31c08-kube-api-access-ch4ld\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.688922 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-utilities\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.688948 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-catalog-content\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.689398 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-catalog-content\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.690566 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-utilities\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.702544 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hl2vl"] Dec 17 09:07:13 crc kubenswrapper[4935]: W1217 09:07:13.709155 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d81a2b6_ac3a_4c8b_8348_e471e5e5a932.slice/crio-700f53af00793c02668e21c994925d39672334100797c98519a148e7b1e73a06 WatchSource:0}: Error finding container 
700f53af00793c02668e21c994925d39672334100797c98519a148e7b1e73a06: Status 404 returned error can't find the container with id 700f53af00793c02668e21c994925d39672334100797c98519a148e7b1e73a06 Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.712006 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch4ld\" (UniqueName: \"kubernetes.io/projected/cb608d96-a065-48d5-b74f-dc166ba31c08-kube-api-access-ch4ld\") pod \"redhat-operators-8h5r5\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.824079 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.894666 4935 generic.go:334] "Generic (PLEG): container finished" podID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerID="ea7b608b61c73d2fe885ba0d1188470262cf2c8cbe2eeb78d8987d1e4f2faedb" exitCode=0 Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.894768 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67fts" event={"ID":"b246facd-9d67-4a8c-9b5f-e160ae46c462","Type":"ContainerDied","Data":"ea7b608b61c73d2fe885ba0d1188470262cf2c8cbe2eeb78d8987d1e4f2faedb"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.895174 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67fts" event={"ID":"b246facd-9d67-4a8c-9b5f-e160ae46c462","Type":"ContainerStarted","Data":"e9cc185d4809f7f5b42737ba2cc8b38a451eba9e8220174d3f46e66940f40959"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.902399 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.902395 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc" event={"ID":"78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524","Type":"ContainerDied","Data":"d09d07a9758bef9937f7eb388df32ce29a4584d898d1ebdd13ccb587c7929f84"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.902549 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d09d07a9758bef9937f7eb388df32ce29a4584d898d1ebdd13ccb587c7929f84" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.906814 4935 generic.go:334] "Generic (PLEG): container finished" podID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerID="0d8b7cd7031386368d82a4ff0601d892e9bbcc17742be33802cde93e0c8c2fc1" exitCode=0 Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.906890 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kchlg" event={"ID":"72303ff8-4dec-4270-b0e6-6414b0cf0c63","Type":"ContainerDied","Data":"0d8b7cd7031386368d82a4ff0601d892e9bbcc17742be33802cde93e0c8c2fc1"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.906923 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kchlg" event={"ID":"72303ff8-4dec-4270-b0e6-6414b0cf0c63","Type":"ContainerStarted","Data":"3cae3a01352f2dcac8ffd44f468f04f765ce744ef2ac412303ffb9d208053a14"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.917769 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" event={"ID":"2aea1606-ff6f-4325-9f92-c83e2c5079c0","Type":"ContainerStarted","Data":"c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.919634 4935 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.931393 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b28d2313-0c54-4d14-b951-9b74d5f49819","Type":"ContainerStarted","Data":"8944dfbe2956c4faefc9a6b692ef8e2321a7b45d9522b063ab9dc8d489986838"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.931464 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b28d2313-0c54-4d14-b951-9b74d5f49819","Type":"ContainerStarted","Data":"bceed74e5c3ddeb5d8e1cb2a30f292f0900698d845438948ee1e748d60809d50"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.955653 4935 generic.go:334] "Generic (PLEG): container finished" podID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerID="6da3e5f8c43110e9b0884d13ca1bd6a3471561af7e0904fca45681e5085602dc" exitCode=0 Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.956023 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjwnt" event={"ID":"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7","Type":"ContainerDied","Data":"6da3e5f8c43110e9b0884d13ca1bd6a3471561af7e0904fca45681e5085602dc"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.971072 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.971048043 podStartE2EDuration="2.971048043s" podCreationTimestamp="2025-12-17 09:07:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:13.970981562 +0000 UTC m=+153.630822325" watchObservedRunningTime="2025-12-17 09:07:13.971048043 +0000 UTC m=+153.630888806" Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.974041 4935 
generic.go:334] "Generic (PLEG): container finished" podID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerID="68d5da1c9b21be3639b425fa44003d8c350f9b9cdb7301735cc934932fe41425" exitCode=0 Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.974121 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl2vl" event={"ID":"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932","Type":"ContainerDied","Data":"68d5da1c9b21be3639b425fa44003d8c350f9b9cdb7301735cc934932fe41425"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.974154 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl2vl" event={"ID":"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932","Type":"ContainerStarted","Data":"700f53af00793c02668e21c994925d39672334100797c98519a148e7b1e73a06"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.978982 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" event={"ID":"0407e29c-77f3-481d-8642-91af970b9ef7","Type":"ContainerStarted","Data":"9d413f7a300641027375ec1190ea657231021069d4cf8dfdcc894c5afda53383"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.987816 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"56beeadf0fc706006532d862d1e8f59ea13c99a8e59cca8a2ba8b5bbad2fa5d3"} Dec 17 09:07:13 crc kubenswrapper[4935]: I1217 09:07:13.994467 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" podStartSLOduration=130.994445084 podStartE2EDuration="2m10.994445084s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:13.99315481 +0000 UTC m=+153.652995563" 
watchObservedRunningTime="2025-12-17 09:07:13.994445084 +0000 UTC m=+153.654285847" Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:13.998946 4935 generic.go:334] "Generic (PLEG): container finished" podID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerID="4f8b0ac3583eb936564b0a6837955793166aed384b51a4620701a6ba38c2425e" exitCode=0 Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:13.999668 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgs6" event={"ID":"1c7a0058-87b0-440a-b80a-ebb69f4d9370","Type":"ContainerDied","Data":"4f8b0ac3583eb936564b0a6837955793166aed384b51a4620701a6ba38c2425e"} Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:14.007593 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-r8svh" Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:14.103977 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-79ntp" podStartSLOduration=14.103945556 podStartE2EDuration="14.103945556s" podCreationTimestamp="2025-12-17 09:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:14.101267265 +0000 UTC m=+153.761108028" watchObservedRunningTime="2025-12-17 09:07:14.103945556 +0000 UTC m=+153.763786319" Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:14.122294 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8h5r5"] Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:14.333399 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:14.343986 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:14 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:14 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:14 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:14 crc kubenswrapper[4935]: I1217 09:07:14.344053 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:15 crc kubenswrapper[4935]: I1217 09:07:15.023247 4935 generic.go:334] "Generic (PLEG): container finished" podID="b28d2313-0c54-4d14-b951-9b74d5f49819" containerID="8944dfbe2956c4faefc9a6b692ef8e2321a7b45d9522b063ab9dc8d489986838" exitCode=0 Dec 17 09:07:15 crc kubenswrapper[4935]: I1217 09:07:15.023756 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b28d2313-0c54-4d14-b951-9b74d5f49819","Type":"ContainerDied","Data":"8944dfbe2956c4faefc9a6b692ef8e2321a7b45d9522b063ab9dc8d489986838"} Dec 17 09:07:15 crc kubenswrapper[4935]: I1217 09:07:15.029892 4935 generic.go:334] "Generic (PLEG): container finished" podID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerID="7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf" exitCode=0 Dec 17 09:07:15 crc kubenswrapper[4935]: I1217 09:07:15.030902 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h5r5" event={"ID":"cb608d96-a065-48d5-b74f-dc166ba31c08","Type":"ContainerDied","Data":"7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf"} Dec 17 09:07:15 crc kubenswrapper[4935]: I1217 09:07:15.030946 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h5r5" 
event={"ID":"cb608d96-a065-48d5-b74f-dc166ba31c08","Type":"ContainerStarted","Data":"21ab6123d8263c6f7df723077c8374b79ae1ed98f1191d2ee73e50ea2871c679"} Dec 17 09:07:15 crc kubenswrapper[4935]: I1217 09:07:15.337938 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:15 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:15 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:15 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:15 crc kubenswrapper[4935]: I1217 09:07:15.338290 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.295024 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.336954 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:16 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:16 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:16 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.337022 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.456018 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28d2313-0c54-4d14-b951-9b74d5f49819-kubelet-dir\") pod \"b28d2313-0c54-4d14-b951-9b74d5f49819\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.456122 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d2313-0c54-4d14-b951-9b74d5f49819-kube-api-access\") pod \"b28d2313-0c54-4d14-b951-9b74d5f49819\" (UID: \"b28d2313-0c54-4d14-b951-9b74d5f49819\") " Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.456523 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b28d2313-0c54-4d14-b951-9b74d5f49819-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b28d2313-0c54-4d14-b951-9b74d5f49819" (UID: "b28d2313-0c54-4d14-b951-9b74d5f49819"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.456908 4935 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b28d2313-0c54-4d14-b951-9b74d5f49819-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.462945 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28d2313-0c54-4d14-b951-9b74d5f49819-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b28d2313-0c54-4d14-b951-9b74d5f49819" (UID: "b28d2313-0c54-4d14-b951-9b74d5f49819"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:07:16 crc kubenswrapper[4935]: I1217 09:07:16.559294 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b28d2313-0c54-4d14-b951-9b74d5f49819-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.045458 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"b28d2313-0c54-4d14-b951-9b74d5f49819","Type":"ContainerDied","Data":"bceed74e5c3ddeb5d8e1cb2a30f292f0900698d845438948ee1e748d60809d50"} Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.045835 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bceed74e5c3ddeb5d8e1cb2a30f292f0900698d845438948ee1e748d60809d50" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.045555 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.337104 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:17 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:17 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:17 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.337217 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.474966 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 17 09:07:17 crc kubenswrapper[4935]: E1217 09:07:17.475233 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28d2313-0c54-4d14-b951-9b74d5f49819" containerName="pruner" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.475280 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28d2313-0c54-4d14-b951-9b74d5f49819" containerName="pruner" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.475394 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28d2313-0c54-4d14-b951-9b74d5f49819" containerName="pruner" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.476713 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.480187 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.480832 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.494160 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.580648 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a97be1f-d8a2-4b27-a131-e1940969bb77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.580728 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a97be1f-d8a2-4b27-a131-e1940969bb77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.682944 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a97be1f-d8a2-4b27-a131-e1940969bb77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.683022 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5a97be1f-d8a2-4b27-a131-e1940969bb77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.683111 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a97be1f-d8a2-4b27-a131-e1940969bb77-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.700078 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a97be1f-d8a2-4b27-a131-e1940969bb77-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:17 crc kubenswrapper[4935]: I1217 09:07:17.811534 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:18 crc kubenswrapper[4935]: I1217 09:07:18.282779 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 17 09:07:18 crc kubenswrapper[4935]: W1217 09:07:18.297196 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5a97be1f_d8a2_4b27_a131_e1940969bb77.slice/crio-647be8d20aa056e1601db0902b5fc9169f80621dc84ba75c7fa644fedf516f25 WatchSource:0}: Error finding container 647be8d20aa056e1601db0902b5fc9169f80621dc84ba75c7fa644fedf516f25: Status 404 returned error can't find the container with id 647be8d20aa056e1601db0902b5fc9169f80621dc84ba75c7fa644fedf516f25 Dec 17 09:07:18 crc kubenswrapper[4935]: I1217 09:07:18.340745 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:18 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:18 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:18 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:18 crc kubenswrapper[4935]: I1217 09:07:18.340818 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:18 crc kubenswrapper[4935]: I1217 09:07:18.574094 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:18 crc kubenswrapper[4935]: I1217 09:07:18.580583 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ckj77" Dec 17 09:07:19 crc 
kubenswrapper[4935]: I1217 09:07:19.063673 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a97be1f-d8a2-4b27-a131-e1940969bb77","Type":"ContainerStarted","Data":"bc13320a07bfb9445c8d4c5d5e1d8ee602f3d823a89d8f17c3e487813230351d"} Dec 17 09:07:19 crc kubenswrapper[4935]: I1217 09:07:19.064445 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a97be1f-d8a2-4b27-a131-e1940969bb77","Type":"ContainerStarted","Data":"647be8d20aa056e1601db0902b5fc9169f80621dc84ba75c7fa644fedf516f25"} Dec 17 09:07:19 crc kubenswrapper[4935]: I1217 09:07:19.253443 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tdxp5" Dec 17 09:07:19 crc kubenswrapper[4935]: I1217 09:07:19.350978 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:19 crc kubenswrapper[4935]: [-]has-synced failed: reason withheld Dec 17 09:07:19 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:19 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:19 crc kubenswrapper[4935]: I1217 09:07:19.351053 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:20 crc kubenswrapper[4935]: I1217 09:07:20.336464 4935 patch_prober.go:28] interesting pod/router-default-5444994796-gtvxm container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 17 09:07:20 crc kubenswrapper[4935]: [-]has-synced 
failed: reason withheld Dec 17 09:07:20 crc kubenswrapper[4935]: [+]process-running ok Dec 17 09:07:20 crc kubenswrapper[4935]: healthz check failed Dec 17 09:07:20 crc kubenswrapper[4935]: I1217 09:07:20.336846 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gtvxm" podUID="9861106b-bcb8-49a2-93a3-14a548a26c57" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 17 09:07:21 crc kubenswrapper[4935]: I1217 09:07:21.097996 4935 generic.go:334] "Generic (PLEG): container finished" podID="5a97be1f-d8a2-4b27-a131-e1940969bb77" containerID="bc13320a07bfb9445c8d4c5d5e1d8ee602f3d823a89d8f17c3e487813230351d" exitCode=0 Dec 17 09:07:21 crc kubenswrapper[4935]: I1217 09:07:21.098057 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a97be1f-d8a2-4b27-a131-e1940969bb77","Type":"ContainerDied","Data":"bc13320a07bfb9445c8d4c5d5e1d8ee602f3d823a89d8f17c3e487813230351d"} Dec 17 09:07:21 crc kubenswrapper[4935]: I1217 09:07:21.339662 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:21 crc kubenswrapper[4935]: I1217 09:07:21.350022 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gtvxm" Dec 17 09:07:23 crc kubenswrapper[4935]: I1217 09:07:23.193961 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-29mfs" Dec 17 09:07:23 crc kubenswrapper[4935]: I1217 09:07:23.582445 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:23 crc kubenswrapper[4935]: I1217 09:07:23.586241 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:07:25 
crc kubenswrapper[4935]: I1217 09:07:25.680046 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:07:25 crc kubenswrapper[4935]: I1217 09:07:25.710264 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77feddc8-547a-42a0-baa3-19dd2915eb9f-metrics-certs\") pod \"network-metrics-daemon-rg2z5\" (UID: \"77feddc8-547a-42a0-baa3-19dd2915eb9f\") " pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:07:25 crc kubenswrapper[4935]: I1217 09:07:25.946487 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rg2z5" Dec 17 09:07:29 crc kubenswrapper[4935]: I1217 09:07:29.655026 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:29 crc kubenswrapper[4935]: I1217 09:07:29.754226 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a97be1f-d8a2-4b27-a131-e1940969bb77-kube-api-access\") pod \"5a97be1f-d8a2-4b27-a131-e1940969bb77\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " Dec 17 09:07:29 crc kubenswrapper[4935]: I1217 09:07:29.754347 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a97be1f-d8a2-4b27-a131-e1940969bb77-kubelet-dir\") pod \"5a97be1f-d8a2-4b27-a131-e1940969bb77\" (UID: \"5a97be1f-d8a2-4b27-a131-e1940969bb77\") " Dec 17 09:07:29 crc kubenswrapper[4935]: I1217 09:07:29.754531 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a97be1f-d8a2-4b27-a131-e1940969bb77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5a97be1f-d8a2-4b27-a131-e1940969bb77" (UID: "5a97be1f-d8a2-4b27-a131-e1940969bb77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:07:29 crc kubenswrapper[4935]: I1217 09:07:29.754858 4935 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a97be1f-d8a2-4b27-a131-e1940969bb77-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:07:29 crc kubenswrapper[4935]: I1217 09:07:29.762170 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a97be1f-d8a2-4b27-a131-e1940969bb77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5a97be1f-d8a2-4b27-a131-e1940969bb77" (UID: "5a97be1f-d8a2-4b27-a131-e1940969bb77"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:07:29 crc kubenswrapper[4935]: I1217 09:07:29.856026 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a97be1f-d8a2-4b27-a131-e1940969bb77-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 17 09:07:30 crc kubenswrapper[4935]: I1217 09:07:30.131450 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:07:30 crc kubenswrapper[4935]: I1217 09:07:30.131939 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:07:30 crc kubenswrapper[4935]: I1217 09:07:30.201682 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5a97be1f-d8a2-4b27-a131-e1940969bb77","Type":"ContainerDied","Data":"647be8d20aa056e1601db0902b5fc9169f80621dc84ba75c7fa644fedf516f25"} Dec 17 09:07:30 crc kubenswrapper[4935]: I1217 09:07:30.202177 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="647be8d20aa056e1601db0902b5fc9169f80621dc84ba75c7fa644fedf516f25" Dec 17 09:07:30 crc kubenswrapper[4935]: I1217 09:07:30.201776 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 17 09:07:32 crc kubenswrapper[4935]: I1217 09:07:32.133526 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:07:43 crc kubenswrapper[4935]: I1217 09:07:43.783317 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-4lw55" Dec 17 09:07:44 crc kubenswrapper[4935]: I1217 09:07:44.249424 4935 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-k2b7h container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 17 09:07:44 crc kubenswrapper[4935]: I1217 09:07:44.249495 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-k2b7h" podUID="722d5ccb-2d22-453f-b6d0-8eca00275efb" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:07:48 crc kubenswrapper[4935]: E1217 09:07:48.276799 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 17 09:07:48 crc kubenswrapper[4935]: E1217 09:07:48.277646 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lwrtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-67fts_openshift-marketplace(b246facd-9d67-4a8c-9b5f-e160ae46c462): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:48 crc kubenswrapper[4935]: E1217 09:07:48.278923 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-67fts" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" Dec 17 09:07:49 crc 
kubenswrapper[4935]: E1217 09:07:49.761759 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-67fts" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" Dec 17 09:07:49 crc kubenswrapper[4935]: E1217 09:07:49.833370 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 17 09:07:49 crc kubenswrapper[4935]: E1217 09:07:49.833592 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjjv6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-wjwnt_openshift-marketplace(7f4e7805-d8ee-4187-bb2c-ac23a2e448b7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:49 crc kubenswrapper[4935]: E1217 09:07:49.834799 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-wjwnt" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" Dec 17 09:07:50 crc 
kubenswrapper[4935]: I1217 09:07:50.186963 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.476464 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 17 09:07:52 crc kubenswrapper[4935]: E1217 09:07:52.477606 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a97be1f-d8a2-4b27-a131-e1940969bb77" containerName="pruner" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.477626 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a97be1f-d8a2-4b27-a131-e1940969bb77" containerName="pruner" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.477762 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a97be1f-d8a2-4b27-a131-e1940969bb77" containerName="pruner" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.478639 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.484617 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.485878 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.486556 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.599834 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c72c675-1957-4b92-823b-67e228ca2a69-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.599917 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c72c675-1957-4b92-823b-67e228ca2a69-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.701097 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c72c675-1957-4b92-823b-67e228ca2a69-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.701592 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7c72c675-1957-4b92-823b-67e228ca2a69-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.701264 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c72c675-1957-4b92-823b-67e228ca2a69-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.725501 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c72c675-1957-4b92-823b-67e228ca2a69-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:52 crc kubenswrapper[4935]: I1217 09:07:52.809628 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 17 09:07:53 crc kubenswrapper[4935]: E1217 09:07:53.735287 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-wjwnt" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" Dec 17 09:07:53 crc kubenswrapper[4935]: E1217 09:07:53.761657 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 17 09:07:53 crc kubenswrapper[4935]: E1217 09:07:53.762424 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q6jjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vcgs6_openshift-marketplace(1c7a0058-87b0-440a-b80a-ebb69f4d9370): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:53 crc kubenswrapper[4935]: E1217 09:07:53.763688 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vcgs6" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" Dec 17 09:07:53 crc 
kubenswrapper[4935]: E1217 09:07:53.815427 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 17 09:07:53 crc kubenswrapper[4935]: E1217 09:07:53.815644 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v8dss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-hl2vl_openshift-marketplace(9d81a2b6-ac3a-4c8b-8348-e471e5e5a932): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:53 crc kubenswrapper[4935]: E1217 09:07:53.816825 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hl2vl" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.506679 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vcgs6" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.506844 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hl2vl" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.593293 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.593519 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n79k6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-6b6bk_openshift-marketplace(157182ad-4c05-4142-9659-4d1309dceec9): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.594703 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-6b6bk" podUID="157182ad-4c05-4142-9659-4d1309dceec9" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.604266 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.604642 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ch4ld,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8h5r5_openshift-marketplace(cb608d96-a065-48d5-b74f-dc166ba31c08): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.606352 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8h5r5" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.610639 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.612914 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.613112 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-22k4b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-7zz8h_openshift-marketplace(78732594-f947-416e-a67d-2c1fd2ae310f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.613441 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bxnrj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-kchlg_openshift-marketplace(72303ff8-4dec-4270-b0e6-6414b0cf0c63): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:07:55 crc kubenswrapper[4935]: E1217 09:07:55.614310 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-7zz8h" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" Dec 17 09:07:55 crc 
kubenswrapper[4935]: E1217 09:07:55.615511 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-kchlg" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" Dec 17 09:07:55 crc kubenswrapper[4935]: I1217 09:07:55.944689 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rg2z5"] Dec 17 09:07:55 crc kubenswrapper[4935]: W1217 09:07:55.948304 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77feddc8_547a_42a0_baa3_19dd2915eb9f.slice/crio-f1993b8bdff28ea97a10ddc68bee873f145315761fbf43db893d549984e6e9cd WatchSource:0}: Error finding container f1993b8bdff28ea97a10ddc68bee873f145315761fbf43db893d549984e6e9cd: Status 404 returned error can't find the container with id f1993b8bdff28ea97a10ddc68bee873f145315761fbf43db893d549984e6e9cd Dec 17 09:07:56 crc kubenswrapper[4935]: I1217 09:07:56.005016 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 17 09:07:56 crc kubenswrapper[4935]: W1217 09:07:56.016594 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7c72c675_1957_4b92_823b_67e228ca2a69.slice/crio-4a7bc6cf7d10b021a9ddc63c2b5e3d9ce72622612d0fffcfa29a487ffd7a1ff7 WatchSource:0}: Error finding container 4a7bc6cf7d10b021a9ddc63c2b5e3d9ce72622612d0fffcfa29a487ffd7a1ff7: Status 404 returned error can't find the container with id 4a7bc6cf7d10b021a9ddc63c2b5e3d9ce72622612d0fffcfa29a487ffd7a1ff7 Dec 17 09:07:56 crc kubenswrapper[4935]: I1217 09:07:56.386941 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" 
event={"ID":"77feddc8-547a-42a0-baa3-19dd2915eb9f","Type":"ContainerStarted","Data":"c04b35c0a450a679a448743bd070e5036d3689721e83183b7f2ffd100976e699"} Dec 17 09:07:56 crc kubenswrapper[4935]: I1217 09:07:56.387576 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" event={"ID":"77feddc8-547a-42a0-baa3-19dd2915eb9f","Type":"ContainerStarted","Data":"f1993b8bdff28ea97a10ddc68bee873f145315761fbf43db893d549984e6e9cd"} Dec 17 09:07:56 crc kubenswrapper[4935]: I1217 09:07:56.389221 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7c72c675-1957-4b92-823b-67e228ca2a69","Type":"ContainerStarted","Data":"4a7bc6cf7d10b021a9ddc63c2b5e3d9ce72622612d0fffcfa29a487ffd7a1ff7"} Dec 17 09:07:56 crc kubenswrapper[4935]: E1217 09:07:56.391234 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-7zz8h" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" Dec 17 09:07:56 crc kubenswrapper[4935]: E1217 09:07:56.391209 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8h5r5" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" Dec 17 09:07:56 crc kubenswrapper[4935]: E1217 09:07:56.391656 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-6b6bk" podUID="157182ad-4c05-4142-9659-4d1309dceec9" Dec 17 09:07:56 crc 
kubenswrapper[4935]: E1217 09:07:56.392886 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-kchlg" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.282812 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.283939 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.292961 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.373353 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kube-api-access\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.373428 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.373459 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-var-lock\") pod \"installer-9-crc\" (UID: 
\"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.398691 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rg2z5" event={"ID":"77feddc8-547a-42a0-baa3-19dd2915eb9f","Type":"ContainerStarted","Data":"0aa47736df221d4e8608fc2bae2da505f04b985149a8e574497dfcc64bd82074"} Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.400634 4935 generic.go:334] "Generic (PLEG): container finished" podID="7c72c675-1957-4b92-823b-67e228ca2a69" containerID="9a8f439dfb5a9077eab060d818010b53d3bf7dfff2f16611d1c53f5ffa7f22c2" exitCode=0 Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.400863 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7c72c675-1957-4b92-823b-67e228ca2a69","Type":"ContainerDied","Data":"9a8f439dfb5a9077eab060d818010b53d3bf7dfff2f16611d1c53f5ffa7f22c2"} Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.413326 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rg2z5" podStartSLOduration=174.413307534 podStartE2EDuration="2m54.413307534s" podCreationTimestamp="2025-12-17 09:05:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:57.413065938 +0000 UTC m=+197.072906701" watchObservedRunningTime="2025-12-17 09:07:57.413307534 +0000 UTC m=+197.073148297" Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.475068 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kube-api-access\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 
09:07:57.475526 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.475628 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-var-lock\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.475701 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.475779 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-var-lock\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.508415 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kube-api-access\") pod \"installer-9-crc\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.603175 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 17 09:07:57 crc kubenswrapper[4935]: I1217 09:07:57.995761 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.410975 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d2d337ee-6c26-4808-bebb-f41dfad7b15d","Type":"ContainerStarted","Data":"a549f4a9d4f64e0b21066ab405183454f0b6ec66938ab7f52562a60df33aa718"}
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.411421 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d2d337ee-6c26-4808-bebb-f41dfad7b15d","Type":"ContainerStarted","Data":"b06ad615ea4a675f96c42151662a15387b86e0356cdbef0ee58d05b3771f91c4"}
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.435911 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.435879834 podStartE2EDuration="1.435879834s" podCreationTimestamp="2025-12-17 09:07:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:07:58.43343236 +0000 UTC m=+198.093273133" watchObservedRunningTime="2025-12-17 09:07:58.435879834 +0000 UTC m=+198.095720597"
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.681469 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.794873 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c72c675-1957-4b92-823b-67e228ca2a69-kubelet-dir\") pod \"7c72c675-1957-4b92-823b-67e228ca2a69\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") "
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.794953 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c72c675-1957-4b92-823b-67e228ca2a69-kube-api-access\") pod \"7c72c675-1957-4b92-823b-67e228ca2a69\" (UID: \"7c72c675-1957-4b92-823b-67e228ca2a69\") "
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.795074 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c72c675-1957-4b92-823b-67e228ca2a69-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7c72c675-1957-4b92-823b-67e228ca2a69" (UID: "7c72c675-1957-4b92-823b-67e228ca2a69"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.795353 4935 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7c72c675-1957-4b92-823b-67e228ca2a69-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.816265 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c72c675-1957-4b92-823b-67e228ca2a69-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7c72c675-1957-4b92-823b-67e228ca2a69" (UID: "7c72c675-1957-4b92-823b-67e228ca2a69"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:07:58 crc kubenswrapper[4935]: I1217 09:07:58.896486 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7c72c675-1957-4b92-823b-67e228ca2a69-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 17 09:07:59 crc kubenswrapper[4935]: I1217 09:07:59.417309 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"7c72c675-1957-4b92-823b-67e228ca2a69","Type":"ContainerDied","Data":"4a7bc6cf7d10b021a9ddc63c2b5e3d9ce72622612d0fffcfa29a487ffd7a1ff7"}
Dec 17 09:07:59 crc kubenswrapper[4935]: I1217 09:07:59.417346 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 17 09:07:59 crc kubenswrapper[4935]: I1217 09:07:59.417358 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a7bc6cf7d10b021a9ddc63c2b5e3d9ce72622612d0fffcfa29a487ffd7a1ff7"
Dec 17 09:08:00 crc kubenswrapper[4935]: I1217 09:08:00.131192 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 17 09:08:00 crc kubenswrapper[4935]: I1217 09:08:00.131288 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 17 09:08:06 crc kubenswrapper[4935]: I1217 09:08:06.458517 4935 generic.go:334] "Generic (PLEG): container finished" podID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerID="6186659f5ebc171c54da6d1311b34cd540d81d4a047de935746c5b8458035765" exitCode=0
Dec 17 09:08:06 crc kubenswrapper[4935]: I1217 09:08:06.458656 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67fts" event={"ID":"b246facd-9d67-4a8c-9b5f-e160ae46c462","Type":"ContainerDied","Data":"6186659f5ebc171c54da6d1311b34cd540d81d4a047de935746c5b8458035765"}
Dec 17 09:08:08 crc kubenswrapper[4935]: I1217 09:08:08.474591 4935 generic.go:334] "Generic (PLEG): container finished" podID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerID="d83e3ae0a69a211e2e5d2b5b7297e4c91fecb8dcfcfdc6723cd1a5ca659f2d61" exitCode=0
Dec 17 09:08:08 crc kubenswrapper[4935]: I1217 09:08:08.474687 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjwnt" event={"ID":"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7","Type":"ContainerDied","Data":"d83e3ae0a69a211e2e5d2b5b7297e4c91fecb8dcfcfdc6723cd1a5ca659f2d61"}
Dec 17 09:08:08 crc kubenswrapper[4935]: I1217 09:08:08.479931 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67fts" event={"ID":"b246facd-9d67-4a8c-9b5f-e160ae46c462","Type":"ContainerStarted","Data":"0faf91f83fb95f019d6637872592f8d09b0a5e4d012868e4b8358d4f890eb41c"}
Dec 17 09:08:08 crc kubenswrapper[4935]: I1217 09:08:08.483427 4935 generic.go:334] "Generic (PLEG): container finished" podID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerID="bfc4851634eb2362eac74fc19b65a799e6ffd02277c5ba4802751ad6c039eaf0" exitCode=0
Dec 17 09:08:08 crc kubenswrapper[4935]: I1217 09:08:08.483503 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kchlg" event={"ID":"72303ff8-4dec-4270-b0e6-6414b0cf0c63","Type":"ContainerDied","Data":"bfc4851634eb2362eac74fc19b65a799e6ffd02277c5ba4802751ad6c039eaf0"}
Dec 17 09:08:08 crc kubenswrapper[4935]: I1217 09:08:08.516855 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67fts" podStartSLOduration=3.036644172 podStartE2EDuration="56.516829806s" podCreationTimestamp="2025-12-17 09:07:12 +0000 UTC" firstStartedPulling="2025-12-17 09:07:13.897314449 +0000 UTC m=+153.557155212" lastFinishedPulling="2025-12-17 09:08:07.377500083 +0000 UTC m=+207.037340846" observedRunningTime="2025-12-17 09:08:08.515892582 +0000 UTC m=+208.175733345" watchObservedRunningTime="2025-12-17 09:08:08.516829806 +0000 UTC m=+208.176670569"
Dec 17 09:08:09 crc kubenswrapper[4935]: I1217 09:08:09.489912 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgs6" event={"ID":"1c7a0058-87b0-440a-b80a-ebb69f4d9370","Type":"ContainerStarted","Data":"a39ba6e8927af3d8104539abf59f7aadd98c35a6be27d590b13c52175014059d"}
Dec 17 09:08:10 crc kubenswrapper[4935]: I1217 09:08:10.498948 4935 generic.go:334] "Generic (PLEG): container finished" podID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerID="a39ba6e8927af3d8104539abf59f7aadd98c35a6be27d590b13c52175014059d" exitCode=0
Dec 17 09:08:10 crc kubenswrapper[4935]: I1217 09:08:10.499014 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgs6" event={"ID":"1c7a0058-87b0-440a-b80a-ebb69f4d9370","Type":"ContainerDied","Data":"a39ba6e8927af3d8104539abf59f7aadd98c35a6be27d590b13c52175014059d"}
Dec 17 09:08:12 crc kubenswrapper[4935]: I1217 09:08:12.452559 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67fts"
Dec 17 09:08:12 crc kubenswrapper[4935]: I1217 09:08:12.452867 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67fts"
Dec 17 09:08:12 crc kubenswrapper[4935]: I1217 09:08:12.908891 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67fts"
Dec 17 09:08:12 crc kubenswrapper[4935]: I1217 09:08:12.950777 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67fts"
Dec 17 09:08:13 crc kubenswrapper[4935]: I1217 09:08:13.522229 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kchlg" event={"ID":"72303ff8-4dec-4270-b0e6-6414b0cf0c63","Type":"ContainerStarted","Data":"36eabaa3f811177858e2b2b506f3916c325ee9d3625afebcac0cb779d341837b"}
Dec 17 09:08:13 crc kubenswrapper[4935]: I1217 09:08:13.545477 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kchlg" podStartSLOduration=2.875920535 podStartE2EDuration="1m1.545454166s" podCreationTimestamp="2025-12-17 09:07:12 +0000 UTC" firstStartedPulling="2025-12-17 09:07:13.91469137 +0000 UTC m=+153.574532133" lastFinishedPulling="2025-12-17 09:08:12.584225001 +0000 UTC m=+212.244065764" observedRunningTime="2025-12-17 09:08:13.54300412 +0000 UTC m=+213.202844883" watchObservedRunningTime="2025-12-17 09:08:13.545454166 +0000 UTC m=+213.205294929"
Dec 17 09:08:15 crc kubenswrapper[4935]: I1217 09:08:15.537500 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjwnt" event={"ID":"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7","Type":"ContainerStarted","Data":"3d9316bc42eec811baa2e18bbac63a9a1ff5223c055468a52d3d70e8cbbe2e02"}
Dec 17 09:08:16 crc kubenswrapper[4935]: I1217 09:08:16.571894 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wjwnt" podStartSLOduration=6.013551943 podStartE2EDuration="1m6.571860964s" podCreationTimestamp="2025-12-17 09:07:10 +0000 UTC" firstStartedPulling="2025-12-17 09:07:13.961568352 +0000 UTC m=+153.621409115" lastFinishedPulling="2025-12-17 09:08:14.519877373 +0000 UTC m=+214.179718136" observedRunningTime="2025-12-17 09:08:16.567086036 +0000 UTC m=+216.226926789" watchObservedRunningTime="2025-12-17 09:08:16.571860964 +0000 UTC m=+216.231701767"
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.557675 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgs6" event={"ID":"1c7a0058-87b0-440a-b80a-ebb69f4d9370","Type":"ContainerStarted","Data":"e78bdbc45b95b2cb37410d9622c20716b35d691c9f760e21dbadadda1f34ea47"}
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.560352 4935 generic.go:334] "Generic (PLEG): container finished" podID="78732594-f947-416e-a67d-2c1fd2ae310f" containerID="aca7db8b2cb3a3e8767dc37ad065d37bee0eb52edc2c7e206fd6ecabe97034ad" exitCode=0
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.560411 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zz8h" event={"ID":"78732594-f947-416e-a67d-2c1fd2ae310f","Type":"ContainerDied","Data":"aca7db8b2cb3a3e8767dc37ad065d37bee0eb52edc2c7e206fd6ecabe97034ad"}
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.562764 4935 generic.go:334] "Generic (PLEG): container finished" podID="157182ad-4c05-4142-9659-4d1309dceec9" containerID="e7b57fa1e1781d8f599dd15482aa23fa4915843debd6281301e8dec2f3f60714" exitCode=0
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.562827 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b6bk" event={"ID":"157182ad-4c05-4142-9659-4d1309dceec9","Type":"ContainerDied","Data":"e7b57fa1e1781d8f599dd15482aa23fa4915843debd6281301e8dec2f3f60714"}
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.565745 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h5r5" event={"ID":"cb608d96-a065-48d5-b74f-dc166ba31c08","Type":"ContainerStarted","Data":"cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393"}
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.568688 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl2vl" event={"ID":"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932","Type":"ContainerStarted","Data":"7996df81a34313ba3b227224176d4edbdf9bba5a3f33a214e15e4de285c0a20c"}
Dec 17 09:08:18 crc kubenswrapper[4935]: I1217 09:08:18.593495 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vcgs6" podStartSLOduration=4.541273604 podStartE2EDuration="1m8.593472223s" podCreationTimestamp="2025-12-17 09:07:10 +0000 UTC" firstStartedPulling="2025-12-17 09:07:14.002342743 +0000 UTC m=+153.662183506" lastFinishedPulling="2025-12-17 09:08:18.054541362 +0000 UTC m=+217.714382125" observedRunningTime="2025-12-17 09:08:18.591791208 +0000 UTC m=+218.251631981" watchObservedRunningTime="2025-12-17 09:08:18.593472223 +0000 UTC m=+218.253312986"
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.577655 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zz8h" event={"ID":"78732594-f947-416e-a67d-2c1fd2ae310f","Type":"ContainerStarted","Data":"29346b18979f61e599e92e98fd35db1ed26e6c3eb934734ed2f4c3d0cc96a0ff"}
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.580702 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b6bk" event={"ID":"157182ad-4c05-4142-9659-4d1309dceec9","Type":"ContainerStarted","Data":"746208ababf4df3a9afbad17a710332212e039555e6e2d97040d92a6a1a87cf0"}
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.583371 4935 generic.go:334] "Generic (PLEG): container finished" podID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerID="cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393" exitCode=0
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.583453 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h5r5" event={"ID":"cb608d96-a065-48d5-b74f-dc166ba31c08","Type":"ContainerDied","Data":"cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393"}
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.585613 4935 generic.go:334] "Generic (PLEG): container finished" podID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerID="7996df81a34313ba3b227224176d4edbdf9bba5a3f33a214e15e4de285c0a20c" exitCode=0
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.585649 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl2vl" event={"ID":"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932","Type":"ContainerDied","Data":"7996df81a34313ba3b227224176d4edbdf9bba5a3f33a214e15e4de285c0a20c"}
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.616318 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6b6bk" podStartSLOduration=4.24806719 podStartE2EDuration="1m10.616255833s" podCreationTimestamp="2025-12-17 09:07:09 +0000 UTC" firstStartedPulling="2025-12-17 09:07:12.882712308 +0000 UTC m=+152.542553071" lastFinishedPulling="2025-12-17 09:08:19.250900951 +0000 UTC m=+218.910741714" observedRunningTime="2025-12-17 09:08:19.614795754 +0000 UTC m=+219.274636537" watchObservedRunningTime="2025-12-17 09:08:19.616255833 +0000 UTC m=+219.276096596"
Dec 17 09:08:19 crc kubenswrapper[4935]: I1217 09:08:19.620529 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7zz8h" podStartSLOduration=3.421969229 podStartE2EDuration="1m9.620516287s" podCreationTimestamp="2025-12-17 09:07:10 +0000 UTC" firstStartedPulling="2025-12-17 09:07:12.874296855 +0000 UTC m=+152.534137618" lastFinishedPulling="2025-12-17 09:08:19.072843913 +0000 UTC m=+218.732684676" observedRunningTime="2025-12-17 09:08:19.599849305 +0000 UTC m=+219.259690068" watchObservedRunningTime="2025-12-17 09:08:19.620516287 +0000 UTC m=+219.280357050"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.244097 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6b6bk"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.244160 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6b6bk"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.460701 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wjwnt"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.461152 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wjwnt"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.514604 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wjwnt"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.638857 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wjwnt"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.655080 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7zz8h"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.655137 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7zz8h"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.949678 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vcgs6"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.950047 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vcgs6"
Dec 17 09:08:20 crc kubenswrapper[4935]: I1217 09:08:20.987765 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vcgs6"
Dec 17 09:08:21 crc kubenswrapper[4935]: I1217 09:08:21.296621 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-6b6bk" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="registry-server" probeResult="failure" output=<
Dec 17 09:08:21 crc kubenswrapper[4935]: timeout: failed to connect service ":50051" within 1s
Dec 17 09:08:21 crc kubenswrapper[4935]: >
Dec 17 09:08:21 crc kubenswrapper[4935]: I1217 09:08:21.602345 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h5r5" event={"ID":"cb608d96-a065-48d5-b74f-dc166ba31c08","Type":"ContainerStarted","Data":"a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99"}
Dec 17 09:08:21 crc kubenswrapper[4935]: I1217 09:08:21.604843 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl2vl" event={"ID":"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932","Type":"ContainerStarted","Data":"f7f7b84877688777d0ca54bb8489bb35e3d4401d811aa2cade9dad2d210b3224"}
Dec 17 09:08:21 crc kubenswrapper[4935]: I1217 09:08:21.623567 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8h5r5" podStartSLOduration=2.394366735 podStartE2EDuration="1m8.623542429s" podCreationTimestamp="2025-12-17 09:07:13 +0000 UTC" firstStartedPulling="2025-12-17 09:07:15.033996366 +0000 UTC m=+154.693837129" lastFinishedPulling="2025-12-17 09:08:21.26317206 +0000 UTC m=+220.923012823" observedRunningTime="2025-12-17 09:08:21.620714123 +0000 UTC m=+221.280554896" watchObservedRunningTime="2025-12-17 09:08:21.623542429 +0000 UTC m=+221.283383202"
Dec 17 09:08:21 crc kubenswrapper[4935]: I1217 09:08:21.638538 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hl2vl" podStartSLOduration=1.327165187 podStartE2EDuration="1m8.638514099s" podCreationTimestamp="2025-12-17 09:07:13 +0000 UTC" firstStartedPulling="2025-12-17 09:07:13.977091113 +0000 UTC m=+153.636931876" lastFinishedPulling="2025-12-17 09:08:21.288440025 +0000 UTC m=+220.948280788" observedRunningTime="2025-12-17 09:08:21.635596641 +0000 UTC m=+221.295437404" watchObservedRunningTime="2025-12-17 09:08:21.638514099 +0000 UTC m=+221.298354852"
Dec 17 09:08:21 crc kubenswrapper[4935]: I1217 09:08:21.702967 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7zz8h" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="registry-server" probeResult="failure" output=<
Dec 17 09:08:21 crc kubenswrapper[4935]: timeout: failed to connect service ":50051" within 1s
Dec 17 09:08:21 crc kubenswrapper[4935]: >
Dec 17 09:08:22 crc kubenswrapper[4935]: I1217 09:08:22.838302 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kchlg"
Dec 17 09:08:22 crc kubenswrapper[4935]: I1217 09:08:22.838697 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kchlg"
Dec 17 09:08:22 crc kubenswrapper[4935]: I1217 09:08:22.887730 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kchlg"
Dec 17 09:08:23 crc kubenswrapper[4935]: I1217 09:08:23.417590 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hl2vl"
Dec 17 09:08:23 crc kubenswrapper[4935]: I1217 09:08:23.417644 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hl2vl"
Dec 17 09:08:23 crc kubenswrapper[4935]: I1217 09:08:23.666656 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kchlg"
Dec 17 09:08:23 crc kubenswrapper[4935]: I1217 09:08:23.825711 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8h5r5"
Dec 17 09:08:23 crc kubenswrapper[4935]: I1217 09:08:23.825767 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8h5r5"
Dec 17 09:08:24 crc kubenswrapper[4935]: I1217 09:08:24.471482 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hl2vl" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="registry-server" probeResult="failure" output=<
Dec 17 09:08:24 crc kubenswrapper[4935]: timeout: failed to connect service ":50051" within 1s
Dec 17 09:08:24 crc kubenswrapper[4935]: >
Dec 17 09:08:24 crc kubenswrapper[4935]: I1217 09:08:24.862325 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8h5r5" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="registry-server" probeResult="failure" output=<
Dec 17 09:08:24 crc kubenswrapper[4935]: timeout: failed to connect service ":50051" within 1s
Dec 17 09:08:24 crc kubenswrapper[4935]: >
Dec 17 09:08:26 crc kubenswrapper[4935]: I1217 09:08:26.961316 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kchlg"]
Dec 17 09:08:26 crc kubenswrapper[4935]: I1217 09:08:26.961616 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kchlg" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="registry-server" containerID="cri-o://36eabaa3f811177858e2b2b506f3916c325ee9d3625afebcac0cb779d341837b" gracePeriod=2
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.641598 4935 generic.go:334] "Generic (PLEG): container finished" podID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerID="36eabaa3f811177858e2b2b506f3916c325ee9d3625afebcac0cb779d341837b" exitCode=0
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.641655 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kchlg" event={"ID":"72303ff8-4dec-4270-b0e6-6414b0cf0c63","Type":"ContainerDied","Data":"36eabaa3f811177858e2b2b506f3916c325ee9d3625afebcac0cb779d341837b"}
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.814667 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kchlg"
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.938644 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxnrj\" (UniqueName: \"kubernetes.io/projected/72303ff8-4dec-4270-b0e6-6414b0cf0c63-kube-api-access-bxnrj\") pod \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") "
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.938726 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-catalog-content\") pod \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") "
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.938783 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-utilities\") pod \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\" (UID: \"72303ff8-4dec-4270-b0e6-6414b0cf0c63\") "
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.939657 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-utilities" (OuterVolumeSpecName: "utilities") pod "72303ff8-4dec-4270-b0e6-6414b0cf0c63" (UID: "72303ff8-4dec-4270-b0e6-6414b0cf0c63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.943990 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72303ff8-4dec-4270-b0e6-6414b0cf0c63-kube-api-access-bxnrj" (OuterVolumeSpecName: "kube-api-access-bxnrj") pod "72303ff8-4dec-4270-b0e6-6414b0cf0c63" (UID: "72303ff8-4dec-4270-b0e6-6414b0cf0c63"). InnerVolumeSpecName "kube-api-access-bxnrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:08:27 crc kubenswrapper[4935]: I1217 09:08:27.960552 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72303ff8-4dec-4270-b0e6-6414b0cf0c63" (UID: "72303ff8-4dec-4270-b0e6-6414b0cf0c63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.040315 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-utilities\") on node \"crc\" DevicePath \"\""
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.040358 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxnrj\" (UniqueName: \"kubernetes.io/projected/72303ff8-4dec-4270-b0e6-6414b0cf0c63-kube-api-access-bxnrj\") on node \"crc\" DevicePath \"\""
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.040374 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72303ff8-4dec-4270-b0e6-6414b0cf0c63-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.653189 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kchlg" event={"ID":"72303ff8-4dec-4270-b0e6-6414b0cf0c63","Type":"ContainerDied","Data":"3cae3a01352f2dcac8ffd44f468f04f765ce744ef2ac412303ffb9d208053a14"}
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.653317 4935 scope.go:117] "RemoveContainer" containerID="36eabaa3f811177858e2b2b506f3916c325ee9d3625afebcac0cb779d341837b"
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.653358 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kchlg"
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.682184 4935 scope.go:117] "RemoveContainer" containerID="bfc4851634eb2362eac74fc19b65a799e6ffd02277c5ba4802751ad6c039eaf0"
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.690416 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kchlg"]
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.695813 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kchlg"]
Dec 17 09:08:28 crc kubenswrapper[4935]: I1217 09:08:28.705291 4935 scope.go:117] "RemoveContainer" containerID="0d8b7cd7031386368d82a4ff0601d892e9bbcc17742be33802cde93e0c8c2fc1"
Dec 17 09:08:29 crc kubenswrapper[4935]: I1217 09:08:29.132512 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" path="/var/lib/kubelet/pods/72303ff8-4dec-4270-b0e6-6414b0cf0c63/volumes"
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.130702 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.130808 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.130919 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw"
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.131671 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.131929 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8" gracePeriod=600
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.319460 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6b6bk"
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.388998 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6b6bk"
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.716325 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7zz8h"
Dec 17 09:08:30 crc kubenswrapper[4935]: I1217 09:08:30.774056 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7zz8h"
Dec 17 09:08:31 crc kubenswrapper[4935]: I1217 09:08:31.005261 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vcgs6"
Dec 17 09:08:31 crc kubenswrapper[4935]: I1217 09:08:31.677110 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8" exitCode=0
Dec 17 09:08:31 crc kubenswrapper[4935]: I1217 09:08:31.677192 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8"}
Dec 17 09:08:32 crc kubenswrapper[4935]: I1217 09:08:32.732049 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mpgxh"]
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.359720 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zz8h"]
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.360311 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7zz8h" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="registry-server" containerID="cri-o://29346b18979f61e599e92e98fd35db1ed26e6c3eb934734ed2f4c3d0cc96a0ff" gracePeriod=2
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.457336 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hl2vl"
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.520694 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hl2vl"
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.692914 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"e5e3223631fb199364a4d543a8278a4bff01bf27d2ba3883d6f2f33ee501b687"}
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.864977 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8h5r5"
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.905498 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8h5r5"
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.960081 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcgs6"]
Dec 17 09:08:33 crc kubenswrapper[4935]: I1217 09:08:33.960370 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vcgs6" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="registry-server" containerID="cri-o://e78bdbc45b95b2cb37410d9622c20716b35d691c9f760e21dbadadda1f34ea47" gracePeriod=2
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.700172 4935 generic.go:334] "Generic (PLEG): container finished" podID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerID="e78bdbc45b95b2cb37410d9622c20716b35d691c9f760e21dbadadda1f34ea47" exitCode=0
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.700221 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgs6" event={"ID":"1c7a0058-87b0-440a-b80a-ebb69f4d9370","Type":"ContainerDied","Data":"e78bdbc45b95b2cb37410d9622c20716b35d691c9f760e21dbadadda1f34ea47"}
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.702960 4935 generic.go:334] "Generic (PLEG): container finished" podID="78732594-f947-416e-a67d-2c1fd2ae310f" containerID="29346b18979f61e599e92e98fd35db1ed26e6c3eb934734ed2f4c3d0cc96a0ff" exitCode=0
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.703038 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zz8h" event={"ID":"78732594-f947-416e-a67d-2c1fd2ae310f","Type":"ContainerDied","Data":"29346b18979f61e599e92e98fd35db1ed26e6c3eb934734ed2f4c3d0cc96a0ff"}
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.873633 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zz8h"
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.876300 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcgs6"
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.944032 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-utilities\") pod \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") "
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.944107 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-catalog-content\") pod \"78732594-f947-416e-a67d-2c1fd2ae310f\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") "
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.944146 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6jjr\" (UniqueName: \"kubernetes.io/projected/1c7a0058-87b0-440a-b80a-ebb69f4d9370-kube-api-access-q6jjr\") pod \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") "
Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.944181 4935 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-utilities\") pod \"78732594-f947-416e-a67d-2c1fd2ae310f\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.944237 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22k4b\" (UniqueName: \"kubernetes.io/projected/78732594-f947-416e-a67d-2c1fd2ae310f-kube-api-access-22k4b\") pod \"78732594-f947-416e-a67d-2c1fd2ae310f\" (UID: \"78732594-f947-416e-a67d-2c1fd2ae310f\") " Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.944304 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-catalog-content\") pod \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\" (UID: \"1c7a0058-87b0-440a-b80a-ebb69f4d9370\") " Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.945073 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-utilities" (OuterVolumeSpecName: "utilities") pod "78732594-f947-416e-a67d-2c1fd2ae310f" (UID: "78732594-f947-416e-a67d-2c1fd2ae310f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.945408 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-utilities" (OuterVolumeSpecName: "utilities") pod "1c7a0058-87b0-440a-b80a-ebb69f4d9370" (UID: "1c7a0058-87b0-440a-b80a-ebb69f4d9370"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.945938 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.945972 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.952824 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c7a0058-87b0-440a-b80a-ebb69f4d9370-kube-api-access-q6jjr" (OuterVolumeSpecName: "kube-api-access-q6jjr") pod "1c7a0058-87b0-440a-b80a-ebb69f4d9370" (UID: "1c7a0058-87b0-440a-b80a-ebb69f4d9370"). InnerVolumeSpecName "kube-api-access-q6jjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:08:34 crc kubenswrapper[4935]: I1217 09:08:34.953118 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78732594-f947-416e-a67d-2c1fd2ae310f-kube-api-access-22k4b" (OuterVolumeSpecName: "kube-api-access-22k4b") pod "78732594-f947-416e-a67d-2c1fd2ae310f" (UID: "78732594-f947-416e-a67d-2c1fd2ae310f"). InnerVolumeSpecName "kube-api-access-22k4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.003903 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1c7a0058-87b0-440a-b80a-ebb69f4d9370" (UID: "1c7a0058-87b0-440a-b80a-ebb69f4d9370"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.004200 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "78732594-f947-416e-a67d-2c1fd2ae310f" (UID: "78732594-f947-416e-a67d-2c1fd2ae310f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.047075 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/78732594-f947-416e-a67d-2c1fd2ae310f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.047133 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6jjr\" (UniqueName: \"kubernetes.io/projected/1c7a0058-87b0-440a-b80a-ebb69f4d9370-kube-api-access-q6jjr\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.047151 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22k4b\" (UniqueName: \"kubernetes.io/projected/78732594-f947-416e-a67d-2c1fd2ae310f-kube-api-access-22k4b\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.047167 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c7a0058-87b0-440a-b80a-ebb69f4d9370-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.711182 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7zz8h" event={"ID":"78732594-f947-416e-a67d-2c1fd2ae310f","Type":"ContainerDied","Data":"f78666ae4b64ece1b3fa87db3b7010876bb5d2809d07032705f4c12b4fa0ed0c"} Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.711241 4935 
scope.go:117] "RemoveContainer" containerID="29346b18979f61e599e92e98fd35db1ed26e6c3eb934734ed2f4c3d0cc96a0ff" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.711252 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7zz8h" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.714100 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vcgs6" event={"ID":"1c7a0058-87b0-440a-b80a-ebb69f4d9370","Type":"ContainerDied","Data":"23ce43dda980e11cac6e1e7111bd5b72d85fcdb6ab8712ef21df24bae36c766b"} Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.714195 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vcgs6" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.733704 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7zz8h"] Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.737971 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7zz8h"] Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.738637 4935 scope.go:117] "RemoveContainer" containerID="aca7db8b2cb3a3e8767dc37ad065d37bee0eb52edc2c7e206fd6ecabe97034ad" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.748891 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vcgs6"] Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.752596 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vcgs6"] Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.761338 4935 scope.go:117] "RemoveContainer" containerID="1df4e7bd26c2e457f798727ff3b7844e40e5021e168192d88439ba35d9dac9d5" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.776758 4935 scope.go:117] "RemoveContainer" 
containerID="e78bdbc45b95b2cb37410d9622c20716b35d691c9f760e21dbadadda1f34ea47" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.791536 4935 scope.go:117] "RemoveContainer" containerID="a39ba6e8927af3d8104539abf59f7aadd98c35a6be27d590b13c52175014059d" Dec 17 09:08:35 crc kubenswrapper[4935]: I1217 09:08:35.808209 4935 scope.go:117] "RemoveContainer" containerID="4f8b0ac3583eb936564b0a6837955793166aed384b51a4620701a6ba38c2425e" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.151911 4935 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152225 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c72c675-1957-4b92-823b-67e228ca2a69" containerName="pruner" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152242 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c72c675-1957-4b92-823b-67e228ca2a69" containerName="pruner" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152264 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152294 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152304 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="extract-content" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152314 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="extract-content" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152326 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="extract-content" 
Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152334 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="extract-content" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152348 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="extract-utilities" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152356 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="extract-utilities" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152370 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="extract-utilities" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152378 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="extract-utilities" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152388 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152395 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152408 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152416 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152428 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="extract-utilities" Dec 
17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152436 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="extract-utilities" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.152447 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="extract-content" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152455 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="extract-content" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152619 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="72303ff8-4dec-4270-b0e6-6414b0cf0c63" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152634 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c72c675-1957-4b92-823b-67e228ca2a69" containerName="pruner" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152742 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.152762 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" containerName="registry-server" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.153176 4935 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.153595 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696" gracePeriod=15 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.153776 4935 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.154222 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209" gracePeriod=15 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.154349 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5" gracePeriod=15 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.154441 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89" gracePeriod=15 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.154659 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902" gracePeriod=15 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155456 4935 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.155783 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155799 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.155814 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155823 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.155834 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155842 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.155857 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155865 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.155879 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155887 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.155905 4935 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155915 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.155925 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.155933 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.156052 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.156067 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.156079 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.156092 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.156103 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.156114 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.195170 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.262820 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.263301 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.263560 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.263599 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.263628 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.263645 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.263690 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.263716 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365702 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365798 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365833 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365860 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365881 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365910 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365906 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365969 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365998 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.366013 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365965 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.366024 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.365936 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.366046 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.366098 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.366130 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.492205 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:08:36 crc kubenswrapper[4935]: W1217 09:08:36.512241 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-99bbf755b70897a75099d5714c4f4846572b0a017522cc678985044b1695d62a WatchSource:0}: Error finding container 99bbf755b70897a75099d5714c4f4846572b0a017522cc678985044b1695d62a: Status 404 returned error can't find the container with id 99bbf755b70897a75099d5714c4f4846572b0a017522cc678985044b1695d62a Dec 17 09:08:36 crc kubenswrapper[4935]: E1217 09:08:36.515544 4935 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881f58d638516b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-17 09:08:36.514805429 +0000 UTC m=+236.174646192,LastTimestamp:2025-12-17 09:08:36.514805429 +0000 UTC m=+236.174646192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.723054 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.724417 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.725217 4935 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209" exitCode=0 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.725240 4935 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5" exitCode=0 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.725248 4935 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902" exitCode=0 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.725255 4935 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89" exitCode=2 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.725327 4935 scope.go:117] "RemoveContainer" containerID="56ba2bbe39df980b8827bb596042c6b14ea63b3a2b88c1d7ec221736eb61cac4" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.728321 4935 generic.go:334] "Generic (PLEG): container finished" podID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" containerID="a549f4a9d4f64e0b21066ab405183454f0b6ec66938ab7f52562a60df33aa718" exitCode=0 Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.728403 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"d2d337ee-6c26-4808-bebb-f41dfad7b15d","Type":"ContainerDied","Data":"a549f4a9d4f64e0b21066ab405183454f0b6ec66938ab7f52562a60df33aa718"} Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.729015 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.729210 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.729443 4935 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:36 crc kubenswrapper[4935]: I1217 09:08:36.731069 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"99bbf755b70897a75099d5714c4f4846572b0a017522cc678985044b1695d62a"} Dec 17 09:08:37 crc kubenswrapper[4935]: I1217 09:08:37.134074 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c7a0058-87b0-440a-b80a-ebb69f4d9370" path="/var/lib/kubelet/pods/1c7a0058-87b0-440a-b80a-ebb69f4d9370/volumes" Dec 17 09:08:37 crc kubenswrapper[4935]: I1217 09:08:37.135464 4935 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78732594-f947-416e-a67d-2c1fd2ae310f" path="/var/lib/kubelet/pods/78732594-f947-416e-a67d-2c1fd2ae310f/volumes" Dec 17 09:08:37 crc kubenswrapper[4935]: I1217 09:08:37.742604 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 17 09:08:37 crc kubenswrapper[4935]: I1217 09:08:37.746662 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca"} Dec 17 09:08:37 crc kubenswrapper[4935]: I1217 09:08:37.747886 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:37 crc kubenswrapper[4935]: I1217 09:08:37.748573 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.019475 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.020436 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.020765 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.090989 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-var-lock\") pod \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.091117 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kube-api-access\") pod \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.091113 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-var-lock" (OuterVolumeSpecName: "var-lock") pod "d2d337ee-6c26-4808-bebb-f41dfad7b15d" (UID: "d2d337ee-6c26-4808-bebb-f41dfad7b15d"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.091145 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kubelet-dir\") pod \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\" (UID: \"d2d337ee-6c26-4808-bebb-f41dfad7b15d\") " Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.091198 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d2d337ee-6c26-4808-bebb-f41dfad7b15d" (UID: "d2d337ee-6c26-4808-bebb-f41dfad7b15d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.091842 4935 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-var-lock\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.091862 4935 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.096585 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d2d337ee-6c26-4808-bebb-f41dfad7b15d" (UID: "d2d337ee-6c26-4808-bebb-f41dfad7b15d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.193034 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d2d337ee-6c26-4808-bebb-f41dfad7b15d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.521304 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.523159 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.524874 4935 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.525516 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.526074 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 
09:08:38.598676 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.598806 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.598866 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.598920 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.598968 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.599014 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.599223 4935 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.599238 4935 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.599248 4935 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.757214 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d2d337ee-6c26-4808-bebb-f41dfad7b15d","Type":"ContainerDied","Data":"b06ad615ea4a675f96c42151662a15387b86e0356cdbef0ee58d05b3771f91c4"} Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.757303 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b06ad615ea4a675f96c42151662a15387b86e0356cdbef0ee58d05b3771f91c4" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.757339 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.760756 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.761681 4935 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696" exitCode=0 Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.762781 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.763004 4935 scope.go:117] "RemoveContainer" containerID="82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.777682 4935 scope.go:117] "RemoveContainer" containerID="3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.780644 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.781198 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.781552 4935 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.785088 4935 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.785336 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.785644 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.791987 4935 scope.go:117] "RemoveContainer" containerID="e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.803633 4935 scope.go:117] "RemoveContainer" containerID="b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.814961 4935 scope.go:117] "RemoveContainer" 
containerID="53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.827958 4935 scope.go:117] "RemoveContainer" containerID="b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.848328 4935 scope.go:117] "RemoveContainer" containerID="82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209" Dec 17 09:08:38 crc kubenswrapper[4935]: E1217 09:08:38.849504 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\": container with ID starting with 82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209 not found: ID does not exist" containerID="82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.849578 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209"} err="failed to get container status \"82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\": rpc error: code = NotFound desc = could not find container \"82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209\": container with ID starting with 82b126915099c221376a013263b900b3ecdbfa17cae207d9a248b69046c2b209 not found: ID does not exist" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.849638 4935 scope.go:117] "RemoveContainer" containerID="3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5" Dec 17 09:08:38 crc kubenswrapper[4935]: E1217 09:08:38.850178 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\": container with ID starting with 
3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5 not found: ID does not exist" containerID="3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.850217 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5"} err="failed to get container status \"3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\": rpc error: code = NotFound desc = could not find container \"3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5\": container with ID starting with 3b16aab2fbbeb7428b41dbd02938e38a9159e9da46ee8d5e2006a83677803db5 not found: ID does not exist" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.850241 4935 scope.go:117] "RemoveContainer" containerID="e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902" Dec 17 09:08:38 crc kubenswrapper[4935]: E1217 09:08:38.850590 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\": container with ID starting with e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902 not found: ID does not exist" containerID="e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.850616 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902"} err="failed to get container status \"e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\": rpc error: code = NotFound desc = could not find container \"e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902\": container with ID starting with e26e5b53c60544f27ad555e6c89d66cf9bff458ef2c858258d9560e8bf45d902 not found: ID does not 
exist" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.850634 4935 scope.go:117] "RemoveContainer" containerID="b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89" Dec 17 09:08:38 crc kubenswrapper[4935]: E1217 09:08:38.850995 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\": container with ID starting with b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89 not found: ID does not exist" containerID="b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.851036 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89"} err="failed to get container status \"b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\": rpc error: code = NotFound desc = could not find container \"b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89\": container with ID starting with b59885d8a70f42a699b7f0f45c69ce795931e123cc7d30f62be4abd0c329fa89 not found: ID does not exist" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.851057 4935 scope.go:117] "RemoveContainer" containerID="53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696" Dec 17 09:08:38 crc kubenswrapper[4935]: E1217 09:08:38.851544 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\": container with ID starting with 53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696 not found: ID does not exist" containerID="53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.851579 4935 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696"} err="failed to get container status \"53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\": rpc error: code = NotFound desc = could not find container \"53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696\": container with ID starting with 53a3e2e3042de7cc7f39b96a7516cf3cb9f309b07f8d21bb8f7ae4820939c696 not found: ID does not exist" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.851597 4935 scope.go:117] "RemoveContainer" containerID="b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5" Dec 17 09:08:38 crc kubenswrapper[4935]: E1217 09:08:38.851962 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\": container with ID starting with b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5 not found: ID does not exist" containerID="b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5" Dec 17 09:08:38 crc kubenswrapper[4935]: I1217 09:08:38.851998 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5"} err="failed to get container status \"b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\": rpc error: code = NotFound desc = could not find container \"b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5\": container with ID starting with b1b1e9cfd30a53263c78c8646deccac276233c7cb604b9c405b1df62a8f8e5a5 not found: ID does not exist" Dec 17 09:08:39 crc kubenswrapper[4935]: I1217 09:08:39.131550 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 17 09:08:40 crc 
kubenswrapper[4935]: E1217 09:08:40.126845 4935 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881f58d638516b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-17 09:08:36.514805429 +0000 UTC m=+236.174646192,LastTimestamp:2025-12-17 09:08:36.514805429 +0000 UTC m=+236.174646192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 17 09:08:41 crc kubenswrapper[4935]: I1217 09:08:41.126468 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:41 crc kubenswrapper[4935]: I1217 09:08:41.127236 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 
09:08:43 crc kubenswrapper[4935]: E1217 09:08:43.420755 4935 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:43 crc kubenswrapper[4935]: E1217 09:08:43.422258 4935 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:43 crc kubenswrapper[4935]: E1217 09:08:43.422870 4935 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:43 crc kubenswrapper[4935]: E1217 09:08:43.423232 4935 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:43 crc kubenswrapper[4935]: E1217 09:08:43.423609 4935 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:43 crc kubenswrapper[4935]: I1217 09:08:43.423668 4935 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 17 09:08:43 crc kubenswrapper[4935]: E1217 09:08:43.424104 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Dec 17 
09:08:43 crc kubenswrapper[4935]: E1217 09:08:43.624657 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Dec 17 09:08:44 crc kubenswrapper[4935]: E1217 09:08:44.026197 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Dec 17 09:08:44 crc kubenswrapper[4935]: E1217 09:08:44.828049 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Dec 17 09:08:46 crc kubenswrapper[4935]: E1217 09:08:46.428765 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Dec 17 09:08:49 crc kubenswrapper[4935]: E1217 09:08:49.630966 4935 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="6.4s" Dec 17 09:08:50 crc kubenswrapper[4935]: E1217 09:08:50.128998 4935 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.75:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1881f58d638516b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-17 09:08:36.514805429 +0000 UTC m=+236.174646192,LastTimestamp:2025-12-17 09:08:36.514805429 +0000 UTC m=+236.174646192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 17 09:08:50 crc kubenswrapper[4935]: I1217 09:08:50.840923 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 17 09:08:50 crc kubenswrapper[4935]: I1217 09:08:50.841209 4935 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5" exitCode=1 Dec 17 09:08:50 crc kubenswrapper[4935]: I1217 09:08:50.841242 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5"} Dec 17 09:08:50 crc kubenswrapper[4935]: I1217 09:08:50.841746 4935 scope.go:117] "RemoveContainer" containerID="672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5" Dec 17 09:08:50 crc kubenswrapper[4935]: I1217 09:08:50.842476 
4935 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:50 crc kubenswrapper[4935]: I1217 09:08:50.843223 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:50 crc kubenswrapper[4935]: I1217 09:08:50.843932 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.124156 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.127183 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.127565 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.127984 4935 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.128240 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.128500 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.128797 4935 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.139867 4935 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.139893 4935 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:51 crc kubenswrapper[4935]: E1217 09:08:51.140236 4935 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.140656 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:51 crc kubenswrapper[4935]: W1217 09:08:51.158775 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-5e2dccf746375252783b4adb8cc2d085413760f2ad7c965ef084a507c72ef691 WatchSource:0}: Error finding container 5e2dccf746375252783b4adb8cc2d085413760f2ad7c965ef084a507c72ef691: Status 404 returned error can't find the container with id 5e2dccf746375252783b4adb8cc2d085413760f2ad7c965ef084a507c72ef691 Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.851162 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.851243 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fd6d680258e5405d4a32d009642f095be9d56cb4cf88a85c5d185a934d1d5604"} Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.852591 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.853156 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" 
Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.853439 4935 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.854421 4935 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="03bea6cd1f6904fb8015fb145aa2bffdd76cb45a616d1ee11819fd07d64c6dae" exitCode=0 Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.854458 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"03bea6cd1f6904fb8015fb145aa2bffdd76cb45a616d1ee11819fd07d64c6dae"} Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.854480 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5e2dccf746375252783b4adb8cc2d085413760f2ad7c965ef084a507c72ef691"} Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.854703 4935 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.854717 4935 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.854971 4935 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: E1217 09:08:51.855086 4935 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.855261 4935 status_manager.go:851] "Failed to get status for pod" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:51 crc kubenswrapper[4935]: I1217 09:08:51.855519 4935 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Dec 17 09:08:52 crc kubenswrapper[4935]: I1217 09:08:52.866555 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cd35d75b0ed5e8803ebfd6c6bc69aea97ef3a05692b0eef5426b34cfce753ded"} Dec 17 09:08:52 crc kubenswrapper[4935]: I1217 09:08:52.866931 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b386655b4db2d31ad5a4cd71931e30a35e8fa872eb510fed84819b3291f196d9"} Dec 17 09:08:52 crc kubenswrapper[4935]: 
I1217 09:08:52.866948 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"00a2315f511cc7460483080c4fa57de1d813718c2d400ea6f8328ded6708e778"} Dec 17 09:08:52 crc kubenswrapper[4935]: I1217 09:08:52.866960 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d2364ef7ea2653e551f6f9290df39b38a082668e2f5c5b920486374826b1c52a"} Dec 17 09:08:53 crc kubenswrapper[4935]: I1217 09:08:53.878403 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"737a75cd3238178d222758d186930d48e64ead6383952b5bac9dc14bd4bf98ee"} Dec 17 09:08:53 crc kubenswrapper[4935]: I1217 09:08:53.878898 4935 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:53 crc kubenswrapper[4935]: I1217 09:08:53.878913 4935 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:53 crc kubenswrapper[4935]: I1217 09:08:53.879354 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:56 crc kubenswrapper[4935]: I1217 09:08:56.070871 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:08:56 crc kubenswrapper[4935]: I1217 09:08:56.071612 4935 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 17 09:08:56 crc kubenswrapper[4935]: I1217 09:08:56.071721 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 17 09:08:56 crc kubenswrapper[4935]: I1217 09:08:56.141924 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:56 crc kubenswrapper[4935]: I1217 09:08:56.141992 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:56 crc kubenswrapper[4935]: I1217 09:08:56.148905 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:57 crc kubenswrapper[4935]: I1217 09:08:57.759059 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" podUID="e6bcddbf-eb05-4170-87db-6021b9da7df0" containerName="oauth-openshift" containerID="cri-o://0e6ad3573cd4baf01ff2cd6cca17eca16420f168dc4690a74dd17af6ac29c255" gracePeriod=15 Dec 17 09:08:57 crc kubenswrapper[4935]: I1217 09:08:57.903229 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" event={"ID":"e6bcddbf-eb05-4170-87db-6021b9da7df0","Type":"ContainerDied","Data":"0e6ad3573cd4baf01ff2cd6cca17eca16420f168dc4690a74dd17af6ac29c255"} Dec 17 09:08:57 crc kubenswrapper[4935]: I1217 09:08:57.903108 4935 generic.go:334] "Generic (PLEG): container finished" podID="e6bcddbf-eb05-4170-87db-6021b9da7df0" 
containerID="0e6ad3573cd4baf01ff2cd6cca17eca16420f168dc4690a74dd17af6ac29c255" exitCode=0 Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.152203 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.286478 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-trusted-ca-bundle\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.286852 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-serving-cert\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.286888 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-router-certs\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.286907 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-dir\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.286930 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-session\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.286951 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-idp-0-file-data\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287002 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-error\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287019 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-cliconfig\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287040 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-provider-selection\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287067 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-login\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287134 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w82mt\" (UniqueName: \"kubernetes.io/projected/e6bcddbf-eb05-4170-87db-6021b9da7df0-kube-api-access-w82mt\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287159 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-ocp-branding-template\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287176 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-service-ca\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287209 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-policies\") pod \"e6bcddbf-eb05-4170-87db-6021b9da7df0\" (UID: \"e6bcddbf-eb05-4170-87db-6021b9da7df0\") " Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287454 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: 
"v4-0-config-system-trusted-ca-bundle") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.287988 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.288023 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.288464 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.288862 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.294604 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.294736 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6bcddbf-eb05-4170-87db-6021b9da7df0-kube-api-access-w82mt" (OuterVolumeSpecName: "kube-api-access-w82mt") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "kube-api-access-w82mt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.295809 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.295915 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.296212 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.296435 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.296638 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.296786 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.296924 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e6bcddbf-eb05-4170-87db-6021b9da7df0" (UID: "e6bcddbf-eb05-4170-87db-6021b9da7df0"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388414 4935 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388448 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388460 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388470 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388478 4935 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6bcddbf-eb05-4170-87db-6021b9da7df0-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc 
kubenswrapper[4935]: I1217 09:08:58.388488 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388497 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388506 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388515 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388524 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388534 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388545 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w82mt\" (UniqueName: 
\"kubernetes.io/projected/e6bcddbf-eb05-4170-87db-6021b9da7df0-kube-api-access-w82mt\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388554 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.388563 4935 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6bcddbf-eb05-4170-87db-6021b9da7df0-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.888653 4935 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.911255 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" event={"ID":"e6bcddbf-eb05-4170-87db-6021b9da7df0","Type":"ContainerDied","Data":"016cd799aff8c35048a8743e006c6a6599d9f2a4f3109b354579cb31985363ab"} Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.911303 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mpgxh" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.911347 4935 scope.go:117] "RemoveContainer" containerID="0e6ad3573cd4baf01ff2cd6cca17eca16420f168dc4690a74dd17af6ac29c255" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.912210 4935 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.912296 4935 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:58 crc kubenswrapper[4935]: I1217 09:08:58.916196 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:08:59 crc kubenswrapper[4935]: E1217 09:08:59.168573 4935 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-idp-0-file-data\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Dec 17 09:08:59 crc kubenswrapper[4935]: E1217 09:08:59.273528 4935 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-system-cliconfig\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Dec 17 09:08:59 crc kubenswrapper[4935]: E1217 09:08:59.385691 4935 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"v4-0-config-user-template-login\": Failed to watch *v1.Secret: unknown (get secrets)" logger="UnhandledError" Dec 17 09:08:59 crc kubenswrapper[4935]: E1217 09:08:59.494410 4935 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Dec 17 09:08:59 crc kubenswrapper[4935]: I1217 09:08:59.642928 4935 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:08:59 crc kubenswrapper[4935]: I1217 09:08:59.923169 4935 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:08:59 crc kubenswrapper[4935]: I1217 09:08:59.923201 4935 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:09:01 crc kubenswrapper[4935]: I1217 09:09:01.148489 4935 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="651d4f2b-3635-4a5f-a081-9012e643877e" Dec 17 09:09:06 crc kubenswrapper[4935]: I1217 09:09:06.071259 4935 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 17 09:09:06 crc kubenswrapper[4935]: I1217 09:09:06.071364 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 17 09:09:08 crc kubenswrapper[4935]: I1217 09:09:08.557079 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 17 09:09:08 crc kubenswrapper[4935]: I1217 09:09:08.951720 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 17 09:09:09 
crc kubenswrapper[4935]: I1217 09:09:09.082572 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 17 09:09:09 crc kubenswrapper[4935]: I1217 09:09:09.825426 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 17 09:09:09 crc kubenswrapper[4935]: I1217 09:09:09.907552 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 17 09:09:09 crc kubenswrapper[4935]: I1217 09:09:09.913087 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.270948 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.596311 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.612630 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.647359 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.722317 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.730993 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.828683 4935 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.940178 4935 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 17 09:09:10 crc kubenswrapper[4935]: I1217 09:09:10.941138 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.113110 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.276598 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.282641 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.361238 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.369872 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.559584 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.779595 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.852124 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.919940 4935 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"trusted-ca-bundle" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.961729 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 17 09:09:11 crc kubenswrapper[4935]: I1217 09:09:11.989316 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.117987 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.167815 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.314577 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.347478 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.424503 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.431649 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.445055 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.500603 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.599472 4935 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.760758 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.837182 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.953655 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 17 09:09:12 crc kubenswrapper[4935]: I1217 09:09:12.960364 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.127017 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.365245 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.534816 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.572146 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.579524 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.625218 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.634202 
4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.777996 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.806264 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.839380 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.839922 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.868850 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.919203 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 17 09:09:13 crc kubenswrapper[4935]: I1217 09:09:13.998915 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.021889 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.072678 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.142196 4935 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.186772 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.257885 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.270899 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.273445 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.279741 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.291698 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.317411 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.322794 4935 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.326397 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=38.326371073 podStartE2EDuration="38.326371073s" podCreationTimestamp="2025-12-17 09:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-17 09:08:58.928773181 +0000 UTC m=+258.588613944" watchObservedRunningTime="2025-12-17 09:09:14.326371073 +0000 UTC m=+273.986211846" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329242 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mpgxh","openshift-kube-apiserver/kube-apiserver-crc"] Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329378 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-84cc499644-wfvdp"] Dec 17 09:09:14 crc kubenswrapper[4935]: E1217 09:09:14.329643 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" containerName="installer" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329671 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" containerName="installer" Dec 17 09:09:14 crc kubenswrapper[4935]: E1217 09:09:14.329691 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6bcddbf-eb05-4170-87db-6021b9da7df0" containerName="oauth-openshift" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329701 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6bcddbf-eb05-4170-87db-6021b9da7df0" containerName="oauth-openshift" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329831 4935 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329864 4935 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="af86d1aa-14d6-4f22-9459-2dfffc50d347" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329867 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2d337ee-6c26-4808-bebb-f41dfad7b15d" 
containerName="installer" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.329883 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6bcddbf-eb05-4170-87db-6021b9da7df0" containerName="oauth-openshift" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.330893 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.335380 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.335602 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.335898 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.337433 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.337459 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.337491 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.337856 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.338048 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.338096 
4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.338453 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.338769 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.338969 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.339041 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.353858 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.359994 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.366088 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.391258 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.391232504 podStartE2EDuration="16.391232504s" podCreationTimestamp="2025-12-17 09:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:09:14.388535575 +0000 UTC m=+274.048376378" 
watchObservedRunningTime="2025-12-17 09:09:14.391232504 +0000 UTC m=+274.051073277" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.394658 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.432424 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg2cw\" (UniqueName: \"kubernetes.io/projected/96066980-abfb-446b-a548-a810c827af3e-kube-api-access-dg2cw\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.432513 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-session\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.432555 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.432621 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: 
\"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.432646 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.432666 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-login\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.433438 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.433645 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-router-certs\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:14 crc 
kubenswrapper[4935]: I1217 09:09:14.433727 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/96066980-abfb-446b-a548-a810c827af3e-audit-dir\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.433780 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.433907 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-audit-policies\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.433978 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.434032 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-error\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.434069 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-service-ca\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.471580 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.510782 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.534692 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535259 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-audit-policies\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535374 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535412 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-service-ca\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535440 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-error\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535497 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-session\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535532 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg2cw\" (UniqueName: \"kubernetes.io/projected/96066980-abfb-446b-a548-a810c827af3e-kube-api-access-dg2cw\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535557 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535593 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535622 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535662 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-login\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535694 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535748 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-router-certs\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535781 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/96066980-abfb-446b-a548-a810c827af3e-audit-dir\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.535805 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.537057 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-audit-policies\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.537650 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.537651 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.537700 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/96066980-abfb-446b-a548-a810c827af3e-audit-dir\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.538160 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-service-ca\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.543954 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-error\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.544449 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.544492 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-login\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.544626 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.545063 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.545914 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-router-certs\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.546131 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-system-session\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.551973 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/96066980-abfb-446b-a548-a810c827af3e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.559580 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg2cw\" (UniqueName: \"kubernetes.io/projected/96066980-abfb-446b-a548-a810c827af3e-kube-api-access-dg2cw\") pod \"oauth-openshift-84cc499644-wfvdp\" (UID: \"96066980-abfb-446b-a548-a810c827af3e\") " pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.658225 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.723966 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.746119 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.747127 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.834095 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.873833 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Dec 17 09:09:14 crc kubenswrapper[4935]: I1217 09:09:14.937190 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.131060 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.131583 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.138224 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6bcddbf-eb05-4170-87db-6021b9da7df0" path="/var/lib/kubelet/pods/e6bcddbf-eb05-4170-87db-6021b9da7df0/volumes"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.323476 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.458693 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.466163 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.485918 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.624520 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.645083 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.660987 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.666827 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.701043 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.803303 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.869190 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.889941 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.913621 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 17 09:09:15 crc kubenswrapper[4935]: I1217 09:09:15.929338 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.071013 4935 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.071122 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.071211 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.072206 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"fd6d680258e5405d4a32d009642f095be9d56cb4cf88a85c5d185a934d1d5604"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.072424 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://fd6d680258e5405d4a32d009642f095be9d56cb4cf88a85c5d185a934d1d5604" gracePeriod=30
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.075071 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.251921 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.364539 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.367726 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.495641 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.508619 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.570524 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.593574 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.641373 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.664594 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.684637 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.868470 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.882359 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.939306 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Dec 17 09:09:16 crc kubenswrapper[4935]: I1217 09:09:16.995921 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.046498 4935 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.053569 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.072064 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.084655 4935 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.090642 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.118995 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.140114 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.149028 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.154978 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.176533 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.287171 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.313872 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.354019 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.397917 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.427105 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.447000 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.491667 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.522149 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.551984 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.573225 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.585859 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.600508 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.667394 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.676993 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.699040 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.730808 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.791366 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.816773 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.870922 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Dec 17 09:09:17 crc kubenswrapper[4935]: I1217 09:09:17.879432 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.013680 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.016178 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.057337 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.322335 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.364183 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.401006 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.431999 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.441966 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.442563 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.471512 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.598926 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.601269 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.619623 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.628790 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.648178 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.651474 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.827501 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.845248 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.847830 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.882586 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.894364 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 17 09:09:18 crc kubenswrapper[4935]: I1217 09:09:18.955133 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.051961 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.154327 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.222140 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.268044 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.279897 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.311926 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.313630 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.336001 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.337468 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.407902 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.470139 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.504646 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.533899 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.535739 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.802517 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.822442 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 17 09:09:19 crc kubenswrapper[4935]: I1217 09:09:19.883206 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.040409 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.058677 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.156180 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.227937 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.228396 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.239508 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.461345 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.489118 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.534487 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.563332 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.599870 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.628909 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.637631 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.682882 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.688698 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.869496 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Dec 17 09:09:20 crc kubenswrapper[4935]: I1217 09:09:20.909290 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.067560 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.237207 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.284217 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.349871
4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.379775 4935 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.380040 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca" gracePeriod=5 Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.399889 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.439449 4935 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.482297 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.547217 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.588555 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.652577 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-84cc499644-wfvdp"] Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.772861 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-84cc499644-wfvdp"] Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 
09:09:21.817941 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.880868 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 17 09:09:21 crc kubenswrapper[4935]: I1217 09:09:21.914871 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.000091 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.060847 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" event={"ID":"96066980-abfb-446b-a548-a810c827af3e","Type":"ContainerStarted","Data":"c4f496f2b293b53c5447ad60dc4f0f6a4705d477dbecb2aa1d3f42b7f284e4f7"} Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.061296 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" event={"ID":"96066980-abfb-446b-a548-a810c827af3e","Type":"ContainerStarted","Data":"209b85855bce6fe532d019bcda59b68caeb576c460b4b002c61507aeb6a8f964"} Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.061538 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.063346 4935 patch_prober.go:28] interesting pod/oauth-openshift-84cc499644-wfvdp container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" start-of-body= Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.063401 4935 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" podUID="96066980-abfb-446b-a548-a810c827af3e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.56:6443/healthz\": dial tcp 10.217.0.56:6443: connect: connection refused" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.086555 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" podStartSLOduration=50.086534797 podStartE2EDuration="50.086534797s" podCreationTimestamp="2025-12-17 09:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:09:22.079488649 +0000 UTC m=+281.739329412" watchObservedRunningTime="2025-12-17 09:09:22.086534797 +0000 UTC m=+281.746375560" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.106654 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.619548 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.645028 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 17 09:09:22 crc kubenswrapper[4935]: I1217 09:09:22.932921 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.012364 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.040862 4935 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.074068 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-84cc499644-wfvdp" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.114898 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.136232 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.340077 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.340727 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.434757 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.434801 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.452378 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.507727 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.768395 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 
09:09:23.947258 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.972049 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 17 09:09:23 crc kubenswrapper[4935]: I1217 09:09:23.984794 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 17 09:09:24 crc kubenswrapper[4935]: I1217 09:09:24.043381 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 17 09:09:24 crc kubenswrapper[4935]: I1217 09:09:24.259244 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 17 09:09:24 crc kubenswrapper[4935]: I1217 09:09:24.279622 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 17 09:09:24 crc kubenswrapper[4935]: I1217 09:09:24.393598 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 17 09:09:25 crc kubenswrapper[4935]: I1217 09:09:25.057201 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 17 09:09:25 crc kubenswrapper[4935]: I1217 09:09:25.342154 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 17 09:09:26 crc kubenswrapper[4935]: I1217 09:09:26.159718 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 17 09:09:26 crc kubenswrapper[4935]: I1217 09:09:26.960675 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 17 09:09:26 crc kubenswrapper[4935]: I1217 09:09:26.960841 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031082 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031206 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031262 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031326 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031398 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031391 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031476 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031514 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031627 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031908 4935 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031935 4935 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031952 4935 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.031969 4935 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.043459 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.101077 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.101146 4935 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca" exitCode=137 Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.101201 4935 scope.go:117] "RemoveContainer" containerID="9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.101233 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.122226 4935 scope.go:117] "RemoveContainer" containerID="9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca" Dec 17 09:09:27 crc kubenswrapper[4935]: E1217 09:09:27.123011 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca\": container with ID starting with 9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca not found: ID does not exist" containerID="9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.123075 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca"} err="failed to get container status \"9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca\": rpc error: code = NotFound desc = could not find container 
\"9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca\": container with ID starting with 9e5e74173760857b59d21fce353197dd899237203de13a2aa1f81511956d52ca not found: ID does not exist" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.133849 4935 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.138447 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.138912 4935 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.153700 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.153751 4935 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="7c0485b0-075e-4744-9019-2a466e1f10a6" Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.161402 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 17 09:09:27 crc kubenswrapper[4935]: I1217 09:09:27.161453 4935 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="7c0485b0-075e-4744-9019-2a466e1f10a6" Dec 17 09:09:33 crc kubenswrapper[4935]: I1217 09:09:33.691052 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 17 09:09:42 crc 
kubenswrapper[4935]: I1217 09:09:42.046763 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 17 09:09:42 crc kubenswrapper[4935]: I1217 09:09:42.192980 4935 generic.go:334] "Generic (PLEG): container finished" podID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerID="eed1cf006e81f4d1393aedf988cc075b6f7c13e80cbab04480af041094c2aa64" exitCode=0 Dec 17 09:09:42 crc kubenswrapper[4935]: I1217 09:09:42.193037 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" event={"ID":"60032bf5-af40-4d89-a7e3-e2e8da6382a3","Type":"ContainerDied","Data":"eed1cf006e81f4d1393aedf988cc075b6f7c13e80cbab04480af041094c2aa64"} Dec 17 09:09:42 crc kubenswrapper[4935]: I1217 09:09:42.193671 4935 scope.go:117] "RemoveContainer" containerID="eed1cf006e81f4d1393aedf988cc075b6f7c13e80cbab04480af041094c2aa64" Dec 17 09:09:43 crc kubenswrapper[4935]: I1217 09:09:43.201461 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" event={"ID":"60032bf5-af40-4d89-a7e3-e2e8da6382a3","Type":"ContainerStarted","Data":"483a9f2cc41906e1de5334669f34f788c1712841e6757f10c4832e2afd5bb9ae"} Dec 17 09:09:43 crc kubenswrapper[4935]: I1217 09:09:43.202373 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:09:43 crc kubenswrapper[4935]: I1217 09:09:43.205687 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:09:44 crc kubenswrapper[4935]: I1217 09:09:44.405616 4935 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 17 09:09:46 crc kubenswrapper[4935]: I1217 09:09:46.225084 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 17 09:09:46 crc kubenswrapper[4935]: I1217 09:09:46.228365 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 17 09:09:46 crc kubenswrapper[4935]: I1217 09:09:46.228445 4935 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fd6d680258e5405d4a32d009642f095be9d56cb4cf88a85c5d185a934d1d5604" exitCode=137 Dec 17 09:09:46 crc kubenswrapper[4935]: I1217 09:09:46.228495 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fd6d680258e5405d4a32d009642f095be9d56cb4cf88a85c5d185a934d1d5604"} Dec 17 09:09:46 crc kubenswrapper[4935]: I1217 09:09:46.228557 4935 scope.go:117] "RemoveContainer" containerID="672bf85e1c245fdabbef48dbc3f9e8bce626bb03645dcff9fa8f755eb50125b5" Dec 17 09:09:47 crc kubenswrapper[4935]: I1217 09:09:47.235409 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 17 09:09:47 crc kubenswrapper[4935]: I1217 09:09:47.236293 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b710a2a8074a814ece57edcce18ba01512878913fdcaed43f21f3ea5d662adbb"} Dec 17 09:09:49 crc kubenswrapper[4935]: I1217 09:09:49.642748 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:09:49 crc kubenswrapper[4935]: I1217 
09:09:49.658033 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 17 09:09:50 crc kubenswrapper[4935]: I1217 09:09:50.257707 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 17 09:09:52 crc kubenswrapper[4935]: I1217 09:09:52.618070 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 17 09:09:56 crc kubenswrapper[4935]: I1217 09:09:56.070761 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:09:56 crc kubenswrapper[4935]: I1217 09:09:56.075564 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:09:56 crc kubenswrapper[4935]: I1217 09:09:56.297352 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 17 09:09:56 crc kubenswrapper[4935]: I1217 09:09:56.761628 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 17 09:09:58 crc kubenswrapper[4935]: I1217 09:09:58.115557 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 17 09:10:04 crc kubenswrapper[4935]: I1217 09:10:04.642363 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8h5r5"] Dec 17 09:10:04 crc kubenswrapper[4935]: I1217 09:10:04.643398 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8h5r5" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="registry-server" 
containerID="cri-o://a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99" gracePeriod=2 Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.001399 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.078241 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-catalog-content\") pod \"cb608d96-a065-48d5-b74f-dc166ba31c08\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.078324 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch4ld\" (UniqueName: \"kubernetes.io/projected/cb608d96-a065-48d5-b74f-dc166ba31c08-kube-api-access-ch4ld\") pod \"cb608d96-a065-48d5-b74f-dc166ba31c08\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.078487 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-utilities\") pod \"cb608d96-a065-48d5-b74f-dc166ba31c08\" (UID: \"cb608d96-a065-48d5-b74f-dc166ba31c08\") " Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.079210 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-utilities" (OuterVolumeSpecName: "utilities") pod "cb608d96-a065-48d5-b74f-dc166ba31c08" (UID: "cb608d96-a065-48d5-b74f-dc166ba31c08"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.084830 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb608d96-a065-48d5-b74f-dc166ba31c08-kube-api-access-ch4ld" (OuterVolumeSpecName: "kube-api-access-ch4ld") pod "cb608d96-a065-48d5-b74f-dc166ba31c08" (UID: "cb608d96-a065-48d5-b74f-dc166ba31c08"). InnerVolumeSpecName "kube-api-access-ch4ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.179727 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch4ld\" (UniqueName: \"kubernetes.io/projected/cb608d96-a065-48d5-b74f-dc166ba31c08-kube-api-access-ch4ld\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.179765 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.196634 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cb608d96-a065-48d5-b74f-dc166ba31c08" (UID: "cb608d96-a065-48d5-b74f-dc166ba31c08"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.280738 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cb608d96-a065-48d5-b74f-dc166ba31c08-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.338893 4935 generic.go:334] "Generic (PLEG): container finished" podID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerID="a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99" exitCode=0 Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.338950 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h5r5" event={"ID":"cb608d96-a065-48d5-b74f-dc166ba31c08","Type":"ContainerDied","Data":"a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99"} Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.338990 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8h5r5" event={"ID":"cb608d96-a065-48d5-b74f-dc166ba31c08","Type":"ContainerDied","Data":"21ab6123d8263c6f7df723077c8374b79ae1ed98f1191d2ee73e50ea2871c679"} Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.338989 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8h5r5" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.339080 4935 scope.go:117] "RemoveContainer" containerID="a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.356412 4935 scope.go:117] "RemoveContainer" containerID="cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.373153 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8h5r5"] Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.378323 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8h5r5"] Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.384674 4935 scope.go:117] "RemoveContainer" containerID="7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.400293 4935 scope.go:117] "RemoveContainer" containerID="a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99" Dec 17 09:10:05 crc kubenswrapper[4935]: E1217 09:10:05.400796 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99\": container with ID starting with a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99 not found: ID does not exist" containerID="a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.400839 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99"} err="failed to get container status \"a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99\": rpc error: code = NotFound desc = could not find container 
\"a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99\": container with ID starting with a3f84e30ef1bb7e954468bd8213a9c33e2256c9ecb105a7695ce1071d4a27b99 not found: ID does not exist" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.400865 4935 scope.go:117] "RemoveContainer" containerID="cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393" Dec 17 09:10:05 crc kubenswrapper[4935]: E1217 09:10:05.401210 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393\": container with ID starting with cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393 not found: ID does not exist" containerID="cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.401238 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393"} err="failed to get container status \"cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393\": rpc error: code = NotFound desc = could not find container \"cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393\": container with ID starting with cd10a37c02759ca38a7c5a0b2c39aa0e8621dabde8948ce792170cd034b04393 not found: ID does not exist" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.401292 4935 scope.go:117] "RemoveContainer" containerID="7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf" Dec 17 09:10:05 crc kubenswrapper[4935]: E1217 09:10:05.401584 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf\": container with ID starting with 7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf not found: ID does not exist" 
containerID="7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf" Dec 17 09:10:05 crc kubenswrapper[4935]: I1217 09:10:05.401613 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf"} err="failed to get container status \"7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf\": rpc error: code = NotFound desc = could not find container \"7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf\": container with ID starting with 7baeb8b13ba6213128a299353832399e9950fe27489b7dfc6158f373cb8b6baf not found: ID does not exist" Dec 17 09:10:07 crc kubenswrapper[4935]: I1217 09:10:07.131869 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" path="/var/lib/kubelet/pods/cb608d96-a065-48d5-b74f-dc166ba31c08/volumes" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.076597 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj"] Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.077733 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" podUID="8823e665-73c6-4f33-a6c2-18a8a750abb9" containerName="route-controller-manager" containerID="cri-o://062be1f48619d15ecf1ec01d58d7bd1472755c52bda1fd0f93f183ad641a75c2" gracePeriod=30 Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.080289 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v725p"] Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.080591 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" podUID="f337d441-0527-46d0-98f4-a9323a682482" 
containerName="controller-manager" containerID="cri-o://96bdc3667af8f814f902f782f581db0bb51e1aedc6cda40ceffc10a9b96ff1fc" gracePeriod=30 Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.412756 4935 generic.go:334] "Generic (PLEG): container finished" podID="8823e665-73c6-4f33-a6c2-18a8a750abb9" containerID="062be1f48619d15ecf1ec01d58d7bd1472755c52bda1fd0f93f183ad641a75c2" exitCode=0 Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.412900 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" event={"ID":"8823e665-73c6-4f33-a6c2-18a8a750abb9","Type":"ContainerDied","Data":"062be1f48619d15ecf1ec01d58d7bd1472755c52bda1fd0f93f183ad641a75c2"} Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.438953 4935 generic.go:334] "Generic (PLEG): container finished" podID="f337d441-0527-46d0-98f4-a9323a682482" containerID="96bdc3667af8f814f902f782f581db0bb51e1aedc6cda40ceffc10a9b96ff1fc" exitCode=0 Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.439034 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" event={"ID":"f337d441-0527-46d0-98f4-a9323a682482","Type":"ContainerDied","Data":"96bdc3667af8f814f902f782f581db0bb51e1aedc6cda40ceffc10a9b96ff1fc"} Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.460323 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.519678 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.635722 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-client-ca\") pod \"8823e665-73c6-4f33-a6c2-18a8a750abb9\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.635823 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-client-ca\") pod \"f337d441-0527-46d0-98f4-a9323a682482\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.635859 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f337d441-0527-46d0-98f4-a9323a682482-serving-cert\") pod \"f337d441-0527-46d0-98f4-a9323a682482\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.635903 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8823e665-73c6-4f33-a6c2-18a8a750abb9-serving-cert\") pod \"8823e665-73c6-4f33-a6c2-18a8a750abb9\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.635931 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-proxy-ca-bundles\") pod \"f337d441-0527-46d0-98f4-a9323a682482\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.635961 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-config\") pod \"f337d441-0527-46d0-98f4-a9323a682482\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.636004 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqcg7\" (UniqueName: \"kubernetes.io/projected/8823e665-73c6-4f33-a6c2-18a8a750abb9-kube-api-access-nqcg7\") pod \"8823e665-73c6-4f33-a6c2-18a8a750abb9\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.636059 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-config\") pod \"8823e665-73c6-4f33-a6c2-18a8a750abb9\" (UID: \"8823e665-73c6-4f33-a6c2-18a8a750abb9\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.636083 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwbgc\" (UniqueName: \"kubernetes.io/projected/f337d441-0527-46d0-98f4-a9323a682482-kube-api-access-mwbgc\") pod \"f337d441-0527-46d0-98f4-a9323a682482\" (UID: \"f337d441-0527-46d0-98f4-a9323a682482\") " Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.637570 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-config" (OuterVolumeSpecName: "config") pod "f337d441-0527-46d0-98f4-a9323a682482" (UID: "f337d441-0527-46d0-98f4-a9323a682482"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.638025 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f337d441-0527-46d0-98f4-a9323a682482" (UID: "f337d441-0527-46d0-98f4-a9323a682482"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.638134 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-client-ca" (OuterVolumeSpecName: "client-ca") pod "f337d441-0527-46d0-98f4-a9323a682482" (UID: "f337d441-0527-46d0-98f4-a9323a682482"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.638214 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-client-ca" (OuterVolumeSpecName: "client-ca") pod "8823e665-73c6-4f33-a6c2-18a8a750abb9" (UID: "8823e665-73c6-4f33-a6c2-18a8a750abb9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.638311 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-config" (OuterVolumeSpecName: "config") pod "8823e665-73c6-4f33-a6c2-18a8a750abb9" (UID: "8823e665-73c6-4f33-a6c2-18a8a750abb9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.643722 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f337d441-0527-46d0-98f4-a9323a682482-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f337d441-0527-46d0-98f4-a9323a682482" (UID: "f337d441-0527-46d0-98f4-a9323a682482"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.643787 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8823e665-73c6-4f33-a6c2-18a8a750abb9-kube-api-access-nqcg7" (OuterVolumeSpecName: "kube-api-access-nqcg7") pod "8823e665-73c6-4f33-a6c2-18a8a750abb9" (UID: "8823e665-73c6-4f33-a6c2-18a8a750abb9"). InnerVolumeSpecName "kube-api-access-nqcg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.644341 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8823e665-73c6-4f33-a6c2-18a8a750abb9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8823e665-73c6-4f33-a6c2-18a8a750abb9" (UID: "8823e665-73c6-4f33-a6c2-18a8a750abb9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.644376 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f337d441-0527-46d0-98f4-a9323a682482-kube-api-access-mwbgc" (OuterVolumeSpecName: "kube-api-access-mwbgc") pod "f337d441-0527-46d0-98f4-a9323a682482" (UID: "f337d441-0527-46d0-98f4-a9323a682482"). InnerVolumeSpecName "kube-api-access-mwbgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737436 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737490 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwbgc\" (UniqueName: \"kubernetes.io/projected/f337d441-0527-46d0-98f4-a9323a682482-kube-api-access-mwbgc\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737503 4935 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8823e665-73c6-4f33-a6c2-18a8a750abb9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737513 4935 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-client-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737522 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f337d441-0527-46d0-98f4-a9323a682482-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737530 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8823e665-73c6-4f33-a6c2-18a8a750abb9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737539 4935 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737553 4935 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f337d441-0527-46d0-98f4-a9323a682482-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:14 crc kubenswrapper[4935]: I1217 09:10:14.737561 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqcg7\" (UniqueName: \"kubernetes.io/projected/8823e665-73c6-4f33-a6c2-18a8a750abb9-kube-api-access-nqcg7\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.240384 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54"] Dec 17 09:10:15 crc kubenswrapper[4935]: E1217 09:10:15.241060 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8823e665-73c6-4f33-a6c2-18a8a750abb9" containerName="route-controller-manager" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241077 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="8823e665-73c6-4f33-a6c2-18a8a750abb9" containerName="route-controller-manager" Dec 17 09:10:15 crc kubenswrapper[4935]: E1217 09:10:15.241091 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="extract-utilities" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241097 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="extract-utilities" Dec 17 09:10:15 crc kubenswrapper[4935]: E1217 09:10:15.241108 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="registry-server" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241115 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="registry-server" Dec 17 09:10:15 crc kubenswrapper[4935]: E1217 09:10:15.241124 4935 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="extract-content" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241131 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="extract-content" Dec 17 09:10:15 crc kubenswrapper[4935]: E1217 09:10:15.241143 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f337d441-0527-46d0-98f4-a9323a682482" containerName="controller-manager" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241150 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f337d441-0527-46d0-98f4-a9323a682482" containerName="controller-manager" Dec 17 09:10:15 crc kubenswrapper[4935]: E1217 09:10:15.241164 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241173 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241296 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f337d441-0527-46d0-98f4-a9323a682482" containerName="controller-manager" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241313 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241324 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="8823e665-73c6-4f33-a6c2-18a8a750abb9" containerName="route-controller-manager" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241335 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb608d96-a065-48d5-b74f-dc166ba31c08" containerName="registry-server" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.241800 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.244962 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6488c7567-ltv2h"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.246173 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.249712 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6488c7567-ltv2h"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.258789 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345498 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-config\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345562 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6hbp\" (UniqueName: \"kubernetes.io/projected/121b9af2-90da-4a68-9858-7980c0807055-kube-api-access-n6hbp\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345676 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-client-ca\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345734 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b9af2-90da-4a68-9858-7980c0807055-serving-cert\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345788 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-serving-cert\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345838 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rs2l\" (UniqueName: \"kubernetes.io/projected/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-kube-api-access-4rs2l\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345866 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-client-ca\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " 
pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.345893 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-proxy-ca-bundles\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.346146 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-config\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.446062 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" event={"ID":"8823e665-73c6-4f33-a6c2-18a8a750abb9","Type":"ContainerDied","Data":"8ba8a5c7f0ae2fce25a93c55300f30cebd2f850c0ad1748424ed3a9fff071851"} Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.446086 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.446126 4935 scope.go:117] "RemoveContainer" containerID="062be1f48619d15ecf1ec01d58d7bd1472755c52bda1fd0f93f183ad641a75c2" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.446890 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-config\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.446932 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-config\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.446961 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6hbp\" (UniqueName: \"kubernetes.io/projected/121b9af2-90da-4a68-9858-7980c0807055-kube-api-access-n6hbp\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.446984 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-client-ca\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: 
I1217 09:10:15.447009 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b9af2-90da-4a68-9858-7980c0807055-serving-cert\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.447038 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-serving-cert\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.447066 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rs2l\" (UniqueName: \"kubernetes.io/projected/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-kube-api-access-4rs2l\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.447090 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-client-ca\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.447112 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-proxy-ca-bundles\") pod \"controller-manager-6488c7567-ltv2h\" (UID: 
\"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.448456 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-proxy-ca-bundles\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.448464 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-client-ca\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.448509 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-config\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.448748 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-config\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.449735 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-client-ca\") pod 
\"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.449874 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" event={"ID":"f337d441-0527-46d0-98f4-a9323a682482","Type":"ContainerDied","Data":"e909646f1e2c6cdd3301255c3e4eda1e7abc21113e08b0e88d9479fce5344f02"} Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.449940 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v725p" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.452847 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-serving-cert\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.457299 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b9af2-90da-4a68-9858-7980c0807055-serving-cert\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.464480 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rs2l\" (UniqueName: \"kubernetes.io/projected/c3fb2b02-0bf9-4912-ae11-e557cd9f0b57-kube-api-access-4rs2l\") pod \"route-controller-manager-5fcd6ff745-6tm54\" (UID: \"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57\") " 
pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.464477 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6hbp\" (UniqueName: \"kubernetes.io/projected/121b9af2-90da-4a68-9858-7980c0807055-kube-api-access-n6hbp\") pod \"controller-manager-6488c7567-ltv2h\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.473169 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.477564 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mmwcj"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.496820 4935 scope.go:117] "RemoveContainer" containerID="96bdc3667af8f814f902f782f581db0bb51e1aedc6cda40ceffc10a9b96ff1fc" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.512917 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v725p"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.516626 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v725p"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.566675 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.580885 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.802591 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6488c7567-ltv2h"] Dec 17 09:10:15 crc kubenswrapper[4935]: I1217 09:10:15.829826 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54"] Dec 17 09:10:15 crc kubenswrapper[4935]: W1217 09:10:15.834739 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fb2b02_0bf9_4912_ae11_e557cd9f0b57.slice/crio-7dd9f99808c5ccc2b3ff66715c8705de5612d84b7295c521bf7f193623842ef8 WatchSource:0}: Error finding container 7dd9f99808c5ccc2b3ff66715c8705de5612d84b7295c521bf7f193623842ef8: Status 404 returned error can't find the container with id 7dd9f99808c5ccc2b3ff66715c8705de5612d84b7295c521bf7f193623842ef8 Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.458554 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" event={"ID":"121b9af2-90da-4a68-9858-7980c0807055","Type":"ContainerStarted","Data":"54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5"} Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.458619 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" event={"ID":"121b9af2-90da-4a68-9858-7980c0807055","Type":"ContainerStarted","Data":"2aa7a24e588f6919e4fcd0bad8ff3c713c62a123b6f8f9c1e5951406df0b9bd1"} Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.458961 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.460036 4935 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" event={"ID":"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57","Type":"ContainerStarted","Data":"358ea2c73eb098dea28b06411f65c3276e942fe0063211828ff17ad17f999d8b"} Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.460090 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" event={"ID":"c3fb2b02-0bf9-4912-ae11-e557cd9f0b57","Type":"ContainerStarted","Data":"7dd9f99808c5ccc2b3ff66715c8705de5612d84b7295c521bf7f193623842ef8"} Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.460689 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.467734 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.467794 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.499373 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" podStartSLOduration=2.499348543 podStartE2EDuration="2.499348543s" podCreationTimestamp="2025-12-17 09:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:10:16.478575468 +0000 UTC m=+336.138416231" watchObservedRunningTime="2025-12-17 09:10:16.499348543 +0000 UTC m=+336.159189306" Dec 17 09:10:16 crc kubenswrapper[4935]: I1217 09:10:16.501376 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-5fcd6ff745-6tm54" podStartSLOduration=2.5013688849999998 podStartE2EDuration="2.501368885s" podCreationTimestamp="2025-12-17 09:10:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:10:16.496585143 +0000 UTC m=+336.156425906" watchObservedRunningTime="2025-12-17 09:10:16.501368885 +0000 UTC m=+336.161209648" Dec 17 09:10:17 crc kubenswrapper[4935]: I1217 09:10:17.132820 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8823e665-73c6-4f33-a6c2-18a8a750abb9" path="/var/lib/kubelet/pods/8823e665-73c6-4f33-a6c2-18a8a750abb9/volumes" Dec 17 09:10:17 crc kubenswrapper[4935]: I1217 09:10:17.133667 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f337d441-0527-46d0-98f4-a9323a682482" path="/var/lib/kubelet/pods/f337d441-0527-46d0-98f4-a9323a682482/volumes" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.696007 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6b6bk"] Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.697330 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6b6bk" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="registry-server" containerID="cri-o://746208ababf4df3a9afbad17a710332212e039555e6e2d97040d92a6a1a87cf0" gracePeriod=30 Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.712701 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wjwnt"] Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.712979 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wjwnt" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="registry-server" 
containerID="cri-o://3d9316bc42eec811baa2e18bbac63a9a1ff5223c055468a52d3d70e8cbbe2e02" gracePeriod=30 Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.726035 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4xq"] Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.726368 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator" containerID="cri-o://483a9f2cc41906e1de5334669f34f788c1712841e6757f10c4832e2afd5bb9ae" gracePeriod=30 Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.732897 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67fts"] Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.733212 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67fts" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="registry-server" containerID="cri-o://0faf91f83fb95f019d6637872592f8d09b0a5e4d012868e4b8358d4f890eb41c" gracePeriod=30 Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.739536 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkk6r"] Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.741214 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.747603 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hl2vl"] Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.747895 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hl2vl" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="registry-server" containerID="cri-o://f7f7b84877688777d0ca54bb8489bb35e3d4401d811aa2cade9dad2d210b3224" gracePeriod=30 Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.752942 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkk6r"] Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.778775 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1baaa40-be04-428b-aaca-a5235d3f167e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.778952 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lflcr\" (UniqueName: \"kubernetes.io/projected/f1baaa40-be04-428b-aaca-a5235d3f167e-kube-api-access-lflcr\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.779102 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/f1baaa40-be04-428b-aaca-a5235d3f167e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.880220 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lflcr\" (UniqueName: \"kubernetes.io/projected/f1baaa40-be04-428b-aaca-a5235d3f167e-kube-api-access-lflcr\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.880369 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f1baaa40-be04-428b-aaca-a5235d3f167e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.880416 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1baaa40-be04-428b-aaca-a5235d3f167e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.881726 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1baaa40-be04-428b-aaca-a5235d3f167e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 
crc kubenswrapper[4935]: I1217 09:10:38.887879 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f1baaa40-be04-428b-aaca-a5235d3f167e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:38 crc kubenswrapper[4935]: I1217 09:10:38.900569 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lflcr\" (UniqueName: \"kubernetes.io/projected/f1baaa40-be04-428b-aaca-a5235d3f167e-kube-api-access-lflcr\") pod \"marketplace-operator-79b997595-zkk6r\" (UID: \"f1baaa40-be04-428b-aaca-a5235d3f167e\") " pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.071816 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.475243 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zkk6r"] Dec 17 09:10:39 crc kubenswrapper[4935]: W1217 09:10:39.480948 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1baaa40_be04_428b_aaca_a5235d3f167e.slice/crio-258f9dd80637546f964ab5ede1af2e293997e783d025a62ac78e34fe6c0cda11 WatchSource:0}: Error finding container 258f9dd80637546f964ab5ede1af2e293997e783d025a62ac78e34fe6c0cda11: Status 404 returned error can't find the container with id 258f9dd80637546f964ab5ede1af2e293997e783d025a62ac78e34fe6c0cda11 Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.598511 4935 generic.go:334] "Generic (PLEG): container finished" podID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" 
containerID="f7f7b84877688777d0ca54bb8489bb35e3d4401d811aa2cade9dad2d210b3224" exitCode=0 Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.598591 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl2vl" event={"ID":"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932","Type":"ContainerDied","Data":"f7f7b84877688777d0ca54bb8489bb35e3d4401d811aa2cade9dad2d210b3224"} Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.600136 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" event={"ID":"f1baaa40-be04-428b-aaca-a5235d3f167e","Type":"ContainerStarted","Data":"258f9dd80637546f964ab5ede1af2e293997e783d025a62ac78e34fe6c0cda11"} Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.602826 4935 generic.go:334] "Generic (PLEG): container finished" podID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerID="0faf91f83fb95f019d6637872592f8d09b0a5e4d012868e4b8358d4f890eb41c" exitCode=0 Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.602900 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67fts" event={"ID":"b246facd-9d67-4a8c-9b5f-e160ae46c462","Type":"ContainerDied","Data":"0faf91f83fb95f019d6637872592f8d09b0a5e4d012868e4b8358d4f890eb41c"} Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.622946 4935 generic.go:334] "Generic (PLEG): container finished" podID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerID="483a9f2cc41906e1de5334669f34f788c1712841e6757f10c4832e2afd5bb9ae" exitCode=0 Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.623027 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" event={"ID":"60032bf5-af40-4d89-a7e3-e2e8da6382a3","Type":"ContainerDied","Data":"483a9f2cc41906e1de5334669f34f788c1712841e6757f10c4832e2afd5bb9ae"} Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.625443 4935 scope.go:117] 
"RemoveContainer" containerID="eed1cf006e81f4d1393aedf988cc075b6f7c13e80cbab04480af041094c2aa64" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.635167 4935 generic.go:334] "Generic (PLEG): container finished" podID="157182ad-4c05-4142-9659-4d1309dceec9" containerID="746208ababf4df3a9afbad17a710332212e039555e6e2d97040d92a6a1a87cf0" exitCode=0 Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.635331 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b6bk" event={"ID":"157182ad-4c05-4142-9659-4d1309dceec9","Type":"ContainerDied","Data":"746208ababf4df3a9afbad17a710332212e039555e6e2d97040d92a6a1a87cf0"} Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.640619 4935 generic.go:334] "Generic (PLEG): container finished" podID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerID="3d9316bc42eec811baa2e18bbac63a9a1ff5223c055468a52d3d70e8cbbe2e02" exitCode=0 Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.640673 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjwnt" event={"ID":"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7","Type":"ContainerDied","Data":"3d9316bc42eec811baa2e18bbac63a9a1ff5223c055468a52d3d70e8cbbe2e02"} Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.888010 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wjwnt" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.896555 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-catalog-content\") pod \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.896603 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjjv6\" (UniqueName: \"kubernetes.io/projected/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-kube-api-access-bjjv6\") pod \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.896721 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-utilities\") pod \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\" (UID: \"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7\") " Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.898681 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-utilities" (OuterVolumeSpecName: "utilities") pod "7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" (UID: "7f4e7805-d8ee-4187-bb2c-ac23a2e448b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.910095 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-kube-api-access-bjjv6" (OuterVolumeSpecName: "kube-api-access-bjjv6") pod "7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" (UID: "7f4e7805-d8ee-4187-bb2c-ac23a2e448b7"). InnerVolumeSpecName "kube-api-access-bjjv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.955799 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" (UID: "7f4e7805-d8ee-4187-bb2c-ac23a2e448b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.997584 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.997617 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:39 crc kubenswrapper[4935]: I1217 09:10:39.997630 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjjv6\" (UniqueName: \"kubernetes.io/projected/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7-kube-api-access-bjjv6\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.191898 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.199695 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-operator-metrics\") pod \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.199756 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-trusted-ca\") pod \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.199785 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-485g5\" (UniqueName: \"kubernetes.io/projected/60032bf5-af40-4d89-a7e3-e2e8da6382a3-kube-api-access-485g5\") pod \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\" (UID: \"60032bf5-af40-4d89-a7e3-e2e8da6382a3\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.200627 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "60032bf5-af40-4d89-a7e3-e2e8da6382a3" (UID: "60032bf5-af40-4d89-a7e3-e2e8da6382a3"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.201092 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67fts" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.204233 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60032bf5-af40-4d89-a7e3-e2e8da6382a3-kube-api-access-485g5" (OuterVolumeSpecName: "kube-api-access-485g5") pod "60032bf5-af40-4d89-a7e3-e2e8da6382a3" (UID: "60032bf5-af40-4d89-a7e3-e2e8da6382a3"). InnerVolumeSpecName "kube-api-access-485g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.204621 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "60032bf5-af40-4d89-a7e3-e2e8da6382a3" (UID: "60032bf5-af40-4d89-a7e3-e2e8da6382a3"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.211465 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6b6bk" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.218370 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hl2vl" Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.300985 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n79k6\" (UniqueName: \"kubernetes.io/projected/157182ad-4c05-4142-9659-4d1309dceec9-kube-api-access-n79k6\") pod \"157182ad-4c05-4142-9659-4d1309dceec9\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301053 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-catalog-content\") pod \"b246facd-9d67-4a8c-9b5f-e160ae46c462\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301123 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-utilities\") pod \"157182ad-4c05-4142-9659-4d1309dceec9\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301146 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8dss\" (UniqueName: \"kubernetes.io/projected/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-kube-api-access-v8dss\") pod \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301187 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-utilities\") pod \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") " Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301217 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-utilities\") pod \"b246facd-9d67-4a8c-9b5f-e160ae46c462\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") "
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301263 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-catalog-content\") pod \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\" (UID: \"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932\") "
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301312 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-catalog-content\") pod \"157182ad-4c05-4142-9659-4d1309dceec9\" (UID: \"157182ad-4c05-4142-9659-4d1309dceec9\") "
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301353 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwrtw\" (UniqueName: \"kubernetes.io/projected/b246facd-9d67-4a8c-9b5f-e160ae46c462-kube-api-access-lwrtw\") pod \"b246facd-9d67-4a8c-9b5f-e160ae46c462\" (UID: \"b246facd-9d67-4a8c-9b5f-e160ae46c462\") "
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301612 4935 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301625 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-485g5\" (UniqueName: \"kubernetes.io/projected/60032bf5-af40-4d89-a7e3-e2e8da6382a3-kube-api-access-485g5\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.301636 4935 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/60032bf5-af40-4d89-a7e3-e2e8da6382a3-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.302493 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-utilities" (OuterVolumeSpecName: "utilities") pod "157182ad-4c05-4142-9659-4d1309dceec9" (UID: "157182ad-4c05-4142-9659-4d1309dceec9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.303472 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-utilities" (OuterVolumeSpecName: "utilities") pod "b246facd-9d67-4a8c-9b5f-e160ae46c462" (UID: "b246facd-9d67-4a8c-9b5f-e160ae46c462"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.303582 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-utilities" (OuterVolumeSpecName: "utilities") pod "9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" (UID: "9d81a2b6-ac3a-4c8b-8348-e471e5e5a932"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.304389 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/157182ad-4c05-4142-9659-4d1309dceec9-kube-api-access-n79k6" (OuterVolumeSpecName: "kube-api-access-n79k6") pod "157182ad-4c05-4142-9659-4d1309dceec9" (UID: "157182ad-4c05-4142-9659-4d1309dceec9"). InnerVolumeSpecName "kube-api-access-n79k6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.304452 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-kube-api-access-v8dss" (OuterVolumeSpecName: "kube-api-access-v8dss") pod "9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" (UID: "9d81a2b6-ac3a-4c8b-8348-e471e5e5a932"). InnerVolumeSpecName "kube-api-access-v8dss". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.304687 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246facd-9d67-4a8c-9b5f-e160ae46c462-kube-api-access-lwrtw" (OuterVolumeSpecName: "kube-api-access-lwrtw") pod "b246facd-9d67-4a8c-9b5f-e160ae46c462" (UID: "b246facd-9d67-4a8c-9b5f-e160ae46c462"). InnerVolumeSpecName "kube-api-access-lwrtw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.334432 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b246facd-9d67-4a8c-9b5f-e160ae46c462" (UID: "b246facd-9d67-4a8c-9b5f-e160ae46c462"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.360054 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "157182ad-4c05-4142-9659-4d1309dceec9" (UID: "157182ad-4c05-4142-9659-4d1309dceec9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402096 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-utilities\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402135 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8dss\" (UniqueName: \"kubernetes.io/projected/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-kube-api-access-v8dss\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402145 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-utilities\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402154 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-utilities\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402163 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/157182ad-4c05-4142-9659-4d1309dceec9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402172 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwrtw\" (UniqueName: \"kubernetes.io/projected/b246facd-9d67-4a8c-9b5f-e160ae46c462-kube-api-access-lwrtw\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402180 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n79k6\" (UniqueName: \"kubernetes.io/projected/157182ad-4c05-4142-9659-4d1309dceec9-kube-api-access-n79k6\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.402188 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b246facd-9d67-4a8c-9b5f-e160ae46c462-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.437075 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" (UID: "9d81a2b6-ac3a-4c8b-8348-e471e5e5a932"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.503455 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.647896 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67fts" event={"ID":"b246facd-9d67-4a8c-9b5f-e160ae46c462","Type":"ContainerDied","Data":"e9cc185d4809f7f5b42737ba2cc8b38a451eba9e8220174d3f46e66940f40959"}
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.647954 4935 scope.go:117] "RemoveContainer" containerID="0faf91f83fb95f019d6637872592f8d09b0a5e4d012868e4b8358d4f890eb41c"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.648774 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67fts"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.649921 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq" event={"ID":"60032bf5-af40-4d89-a7e3-e2e8da6382a3","Type":"ContainerDied","Data":"bb1c28a02bd55017871486b3e2d34f607c3476258757da1524eda93c754973a1"}
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.649995 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5j4xq"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.665995 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6b6bk"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.666052 4935 scope.go:117] "RemoveContainer" containerID="6186659f5ebc171c54da6d1311b34cd540d81d4a047de935746c5b8458035765"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.666153 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6b6bk" event={"ID":"157182ad-4c05-4142-9659-4d1309dceec9","Type":"ContainerDied","Data":"1cebe785aa00c37c774a0c2c2eb46df23217d362171dcee9c567ddcca40208f2"}
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.675051 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wjwnt" event={"ID":"7f4e7805-d8ee-4187-bb2c-ac23a2e448b7","Type":"ContainerDied","Data":"fa1612f497e62c5abe138ffd12f070806981fa57910517eb62f7111784de6c48"}
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.675151 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wjwnt"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.677028 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" event={"ID":"f1baaa40-be04-428b-aaca-a5235d3f167e","Type":"ContainerStarted","Data":"dd6b4e93ec91303b3fb30b8f2f64dfbadc6c5f7b0e93281b7ef4265d5a0388c2"}
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.679866 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hl2vl" event={"ID":"9d81a2b6-ac3a-4c8b-8348-e471e5e5a932","Type":"ContainerDied","Data":"700f53af00793c02668e21c994925d39672334100797c98519a148e7b1e73a06"}
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.680032 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hl2vl"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.693820 4935 scope.go:117] "RemoveContainer" containerID="ea7b608b61c73d2fe885ba0d1188470262cf2c8cbe2eeb78d8987d1e4f2faedb"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.696949 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67fts"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.702385 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67fts"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.718540 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r" podStartSLOduration=2.718498585 podStartE2EDuration="2.718498585s" podCreationTimestamp="2025-12-17 09:10:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:10:40.710692827 +0000 UTC m=+360.370533610" watchObservedRunningTime="2025-12-17 09:10:40.718498585 +0000 UTC m=+360.378339348"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.732042 4935 scope.go:117] "RemoveContainer" containerID="483a9f2cc41906e1de5334669f34f788c1712841e6757f10c4832e2afd5bb9ae"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.744939 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4xq"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.753210 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5j4xq"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.758202 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hl2vl"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.763249 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hl2vl"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.766443 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6b6bk"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.769067 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6b6bk"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.770440 4935 scope.go:117] "RemoveContainer" containerID="746208ababf4df3a9afbad17a710332212e039555e6e2d97040d92a6a1a87cf0"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.785027 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wjwnt"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.787802 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wjwnt"]
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.794615 4935 scope.go:117] "RemoveContainer" containerID="e7b57fa1e1781d8f599dd15482aa23fa4915843debd6281301e8dec2f3f60714"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.810767 4935 scope.go:117] "RemoveContainer" containerID="248b15934de31bbfd86db5d91efd0f8cfe0d1b3d0ae2a4839476ed0cc45c44aa"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.854778 4935 scope.go:117] "RemoveContainer" containerID="3d9316bc42eec811baa2e18bbac63a9a1ff5223c055468a52d3d70e8cbbe2e02"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.871588 4935 scope.go:117] "RemoveContainer" containerID="d83e3ae0a69a211e2e5d2b5b7297e4c91fecb8dcfcfdc6723cd1a5ca659f2d61"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.886544 4935 scope.go:117] "RemoveContainer" containerID="6da3e5f8c43110e9b0884d13ca1bd6a3471561af7e0904fca45681e5085602dc"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.908238 4935 scope.go:117] "RemoveContainer" containerID="f7f7b84877688777d0ca54bb8489bb35e3d4401d811aa2cade9dad2d210b3224"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.923471 4935 scope.go:117] "RemoveContainer" containerID="7996df81a34313ba3b227224176d4edbdf9bba5a3f33a214e15e4de285c0a20c"
Dec 17 09:10:40 crc kubenswrapper[4935]: I1217 09:10:40.940017 4935 scope.go:117] "RemoveContainer" containerID="68d5da1c9b21be3639b425fa44003d8c350f9b9cdb7301735cc934932fe41425"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.133624 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="157182ad-4c05-4142-9659-4d1309dceec9" path="/var/lib/kubelet/pods/157182ad-4c05-4142-9659-4d1309dceec9/volumes"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.134746 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" path="/var/lib/kubelet/pods/60032bf5-af40-4d89-a7e3-e2e8da6382a3/volumes"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.135946 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" path="/var/lib/kubelet/pods/7f4e7805-d8ee-4187-bb2c-ac23a2e448b7/volumes"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.136913 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" path="/var/lib/kubelet/pods/9d81a2b6-ac3a-4c8b-8348-e471e5e5a932/volumes"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.138385 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" path="/var/lib/kubelet/pods/b246facd-9d67-4a8c-9b5f-e160ae46c462/volumes"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652556 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s9kxf"]
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652829 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652845 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652862 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652870 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652882 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652889 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652905 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652912 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652921 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652928 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652939 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652946 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="extract-content"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652954 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652961 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652971 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652977 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.652987 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.652995 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.653003 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653010 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.653021 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653028 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.653037 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653045 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.653053 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653060 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: E1217 09:10:41.653068 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653075 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="extract-utilities"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653194 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d81a2b6-ac3a-4c8b-8348-e471e5e5a932" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653206 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="b246facd-9d67-4a8c-9b5f-e160ae46c462" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653215 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f4e7805-d8ee-4187-bb2c-ac23a2e448b7" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653227 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="157182ad-4c05-4142-9659-4d1309dceec9" containerName="registry-server"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653236 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.653468 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="60032bf5-af40-4d89-a7e3-e2e8da6382a3" containerName="marketplace-operator"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.654235 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.656580 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.661610 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9kxf"]
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.685935 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.687983 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zkk6r"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.729165 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e7e638-f648-4b5f-9589-9258a45be193-utilities\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.729264 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e7e638-f648-4b5f-9589-9258a45be193-catalog-content\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.729422 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6dm\" (UniqueName: \"kubernetes.io/projected/13e7e638-f648-4b5f-9589-9258a45be193-kube-api-access-9r6dm\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.830358 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6dm\" (UniqueName: \"kubernetes.io/projected/13e7e638-f648-4b5f-9589-9258a45be193-kube-api-access-9r6dm\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.830463 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e7e638-f648-4b5f-9589-9258a45be193-utilities\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.830497 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e7e638-f648-4b5f-9589-9258a45be193-catalog-content\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.831093 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13e7e638-f648-4b5f-9589-9258a45be193-utilities\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.831374 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13e7e638-f648-4b5f-9589-9258a45be193-catalog-content\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.851800 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6dm\" (UniqueName: \"kubernetes.io/projected/13e7e638-f648-4b5f-9589-9258a45be193-kube-api-access-9r6dm\") pod \"community-operators-s9kxf\" (UID: \"13e7e638-f648-4b5f-9589-9258a45be193\") " pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:41 crc kubenswrapper[4935]: I1217 09:10:41.977763 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s9kxf"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.382605 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s9kxf"]
Dec 17 09:10:42 crc kubenswrapper[4935]: W1217 09:10:42.388931 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13e7e638_f648_4b5f_9589_9258a45be193.slice/crio-85b9cb7bb92f02223c1b2d4d7f3f085794a0ad3e090eb21209684eb91ac3884e WatchSource:0}: Error finding container 85b9cb7bb92f02223c1b2d4d7f3f085794a0ad3e090eb21209684eb91ac3884e: Status 404 returned error can't find the container with id 85b9cb7bb92f02223c1b2d4d7f3f085794a0ad3e090eb21209684eb91ac3884e
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.589040 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67jgk"]
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.594842 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.610642 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67jgk"]
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.649803 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-55vfx"]
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.651024 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-55vfx"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.653024 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.663339 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-55vfx"]
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.694376 4935 generic.go:334] "Generic (PLEG): container finished" podID="13e7e638-f648-4b5f-9589-9258a45be193" containerID="26015075ec13096de211b4e9155e2cd42c06908686b3afde82e881beaefdc57e" exitCode=0
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.694474 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9kxf" event={"ID":"13e7e638-f648-4b5f-9589-9258a45be193","Type":"ContainerDied","Data":"26015075ec13096de211b4e9155e2cd42c06908686b3afde82e881beaefdc57e"}
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.694529 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9kxf" event={"ID":"13e7e638-f648-4b5f-9589-9258a45be193","Type":"ContainerStarted","Data":"85b9cb7bb92f02223c1b2d4d7f3f085794a0ad3e090eb21209684eb91ac3884e"}
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744031 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-bound-sa-token\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744105 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/043d4fa9-df38-43d5-82a3-47193d40b932-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744265 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/043d4fa9-df38-43d5-82a3-47193d40b932-trusted-ca\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744377 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744408 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-registry-tls\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744485 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/043d4fa9-df38-43d5-82a3-47193d40b932-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744533 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mpgw\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-kube-api-access-7mpgw\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.744585 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/043d4fa9-df38-43d5-82a3-47193d40b932-registry-certificates\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.768289 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.845755 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-registry-tls\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.845831 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/043d4fa9-df38-43d5-82a3-47193d40b932-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.845860 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe737c1-a407-4da3-a492-a49c892b1db9-utilities\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.845883 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mpgw\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-kube-api-access-7mpgw\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.845910 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/043d4fa9-df38-43d5-82a3-47193d40b932-registry-certificates\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.845959 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe737c1-a407-4da3-a492-a49c892b1db9-catalog-content\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.846002 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-bound-sa-token\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.846022 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsmvs\" (UniqueName: \"kubernetes.io/projected/2fe737c1-a407-4da3-a492-a49c892b1db9-kube-api-access-bsmvs\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.846047 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/043d4fa9-df38-43d5-82a3-47193d40b932-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk"
Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.846083 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/043d4fa9-df38-43d5-82a3-47193d40b932-trusted-ca\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") "
pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.846369 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/043d4fa9-df38-43d5-82a3-47193d40b932-ca-trust-extracted\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.847202 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/043d4fa9-df38-43d5-82a3-47193d40b932-trusted-ca\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.847298 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/043d4fa9-df38-43d5-82a3-47193d40b932-registry-certificates\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.852168 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/043d4fa9-df38-43d5-82a3-47193d40b932-installation-pull-secrets\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.852200 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-registry-tls\") pod \"image-registry-66df7c8f76-67jgk\" (UID: 
\"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.864068 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mpgw\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-kube-api-access-7mpgw\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.864217 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/043d4fa9-df38-43d5-82a3-47193d40b932-bound-sa-token\") pod \"image-registry-66df7c8f76-67jgk\" (UID: \"043d4fa9-df38-43d5-82a3-47193d40b932\") " pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.923374 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.947622 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe737c1-a407-4da3-a492-a49c892b1db9-catalog-content\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.948202 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsmvs\" (UniqueName: \"kubernetes.io/projected/2fe737c1-a407-4da3-a492-a49c892b1db9-kube-api-access-bsmvs\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.948233 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fe737c1-a407-4da3-a492-a49c892b1db9-catalog-content\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.948329 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe737c1-a407-4da3-a492-a49c892b1db9-utilities\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.948672 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fe737c1-a407-4da3-a492-a49c892b1db9-utilities\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " 
pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.968632 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsmvs\" (UniqueName: \"kubernetes.io/projected/2fe737c1-a407-4da3-a492-a49c892b1db9-kube-api-access-bsmvs\") pod \"redhat-operators-55vfx\" (UID: \"2fe737c1-a407-4da3-a492-a49c892b1db9\") " pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:42 crc kubenswrapper[4935]: I1217 09:10:42.969061 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.342476 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-67jgk"] Dec 17 09:10:43 crc kubenswrapper[4935]: W1217 09:10:43.351247 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043d4fa9_df38_43d5_82a3_47193d40b932.slice/crio-ffdee5db275c21f5afcebcefed33ca5f9f42c9ebf1fa01e904d7a4daa0c46255 WatchSource:0}: Error finding container ffdee5db275c21f5afcebcefed33ca5f9f42c9ebf1fa01e904d7a4daa0c46255: Status 404 returned error can't find the container with id ffdee5db275c21f5afcebcefed33ca5f9f42c9ebf1fa01e904d7a4daa0c46255 Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.434533 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-55vfx"] Dec 17 09:10:43 crc kubenswrapper[4935]: W1217 09:10:43.447596 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fe737c1_a407_4da3_a492_a49c892b1db9.slice/crio-2178099edd193ca84205077b7bda5e615ec1f968db801c71462e6391eb6e569d WatchSource:0}: Error finding container 2178099edd193ca84205077b7bda5e615ec1f968db801c71462e6391eb6e569d: Status 404 returned error can't find the container with 
id 2178099edd193ca84205077b7bda5e615ec1f968db801c71462e6391eb6e569d Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.701687 4935 generic.go:334] "Generic (PLEG): container finished" podID="2fe737c1-a407-4da3-a492-a49c892b1db9" containerID="c40760293784ddf8ba29849618222bd185ac70e8643cc2b0e897838be528a9e8" exitCode=0 Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.702284 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55vfx" event={"ID":"2fe737c1-a407-4da3-a492-a49c892b1db9","Type":"ContainerDied","Data":"c40760293784ddf8ba29849618222bd185ac70e8643cc2b0e897838be528a9e8"} Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.702615 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55vfx" event={"ID":"2fe737c1-a407-4da3-a492-a49c892b1db9","Type":"ContainerStarted","Data":"2178099edd193ca84205077b7bda5e615ec1f968db801c71462e6391eb6e569d"} Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.708171 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" event={"ID":"043d4fa9-df38-43d5-82a3-47193d40b932","Type":"ContainerStarted","Data":"d619e4b12b35a3f41d8ba2611792e75eb96019974b1280258138221a647368cd"} Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.708227 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" event={"ID":"043d4fa9-df38-43d5-82a3-47193d40b932","Type":"ContainerStarted","Data":"ffdee5db275c21f5afcebcefed33ca5f9f42c9ebf1fa01e904d7a4daa0c46255"} Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.708336 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:10:43 crc kubenswrapper[4935]: I1217 09:10:43.744151 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" podStartSLOduration=1.7441265449999999 podStartE2EDuration="1.744126545s" podCreationTimestamp="2025-12-17 09:10:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:10:43.739144669 +0000 UTC m=+363.398985442" watchObservedRunningTime="2025-12-17 09:10:43.744126545 +0000 UTC m=+363.403967308" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.049269 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wvpv9"] Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.050727 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.052993 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.062722 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wvpv9"] Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.164248 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618e1775-1752-4cf2-b98a-562046bdc3f6-catalog-content\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.164361 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618e1775-1752-4cf2-b98a-562046bdc3f6-utilities\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" 
Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.164417 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k72fw\" (UniqueName: \"kubernetes.io/projected/618e1775-1752-4cf2-b98a-562046bdc3f6-kube-api-access-k72fw\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.265722 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618e1775-1752-4cf2-b98a-562046bdc3f6-utilities\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.265789 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k72fw\" (UniqueName: \"kubernetes.io/projected/618e1775-1752-4cf2-b98a-562046bdc3f6-kube-api-access-k72fw\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.265877 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618e1775-1752-4cf2-b98a-562046bdc3f6-catalog-content\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.266580 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/618e1775-1752-4cf2-b98a-562046bdc3f6-utilities\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 
09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.266705 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/618e1775-1752-4cf2-b98a-562046bdc3f6-catalog-content\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.288387 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k72fw\" (UniqueName: \"kubernetes.io/projected/618e1775-1752-4cf2-b98a-562046bdc3f6-kube-api-access-k72fw\") pod \"redhat-marketplace-wvpv9\" (UID: \"618e1775-1752-4cf2-b98a-562046bdc3f6\") " pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.371130 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.712046 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55vfx" event={"ID":"2fe737c1-a407-4da3-a492-a49c892b1db9","Type":"ContainerStarted","Data":"5306adc92bc3bff9f2cdc80d14c0612f570fbc7aafa40bfec0a8fa2cac874dbb"} Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.714439 4935 generic.go:334] "Generic (PLEG): container finished" podID="13e7e638-f648-4b5f-9589-9258a45be193" containerID="dc72f84994bf700ddcec82398630568858f71b804142ed16ccb0752b13b6f4b5" exitCode=0 Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.714495 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9kxf" event={"ID":"13e7e638-f648-4b5f-9589-9258a45be193","Type":"ContainerDied","Data":"dc72f84994bf700ddcec82398630568858f71b804142ed16ccb0752b13b6f4b5"} Dec 17 09:10:44 crc kubenswrapper[4935]: I1217 09:10:44.829200 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-wvpv9"] Dec 17 09:10:44 crc kubenswrapper[4935]: W1217 09:10:44.844678 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod618e1775_1752_4cf2_b98a_562046bdc3f6.slice/crio-970b1827687c89c38e502466882c881c05db382a6e79ac24b37488dfd1cf8671 WatchSource:0}: Error finding container 970b1827687c89c38e502466882c881c05db382a6e79ac24b37488dfd1cf8671: Status 404 returned error can't find the container with id 970b1827687c89c38e502466882c881c05db382a6e79ac24b37488dfd1cf8671 Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.051369 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zpgjx"] Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.052489 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.059605 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.063881 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpgjx"] Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.179379 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjth\" (UniqueName: \"kubernetes.io/projected/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-kube-api-access-5cjth\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.179453 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-catalog-content\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.179489 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-utilities\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.280581 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjth\" (UniqueName: \"kubernetes.io/projected/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-kube-api-access-5cjth\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.283413 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-catalog-content\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.283584 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-utilities\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.284231 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-utilities\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.284674 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-catalog-content\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.306576 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjth\" (UniqueName: \"kubernetes.io/projected/6e4d8134-256e-4ca3-adb5-7dc5beaf48ac-kube-api-access-5cjth\") pod \"certified-operators-zpgjx\" (UID: \"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac\") " pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.383737 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.725533 4935 generic.go:334] "Generic (PLEG): container finished" podID="2fe737c1-a407-4da3-a492-a49c892b1db9" containerID="5306adc92bc3bff9f2cdc80d14c0612f570fbc7aafa40bfec0a8fa2cac874dbb" exitCode=0 Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.725609 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55vfx" event={"ID":"2fe737c1-a407-4da3-a492-a49c892b1db9","Type":"ContainerDied","Data":"5306adc92bc3bff9f2cdc80d14c0612f570fbc7aafa40bfec0a8fa2cac874dbb"} Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.728926 4935 generic.go:334] "Generic (PLEG): container finished" podID="618e1775-1752-4cf2-b98a-562046bdc3f6" containerID="36eb7f1afad27f6f7a1a95fb75175a6f72271afe9d8b7e8f4721634238a49117" exitCode=0 Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.729022 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvpv9" event={"ID":"618e1775-1752-4cf2-b98a-562046bdc3f6","Type":"ContainerDied","Data":"36eb7f1afad27f6f7a1a95fb75175a6f72271afe9d8b7e8f4721634238a49117"} Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.729057 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvpv9" event={"ID":"618e1775-1752-4cf2-b98a-562046bdc3f6","Type":"ContainerStarted","Data":"970b1827687c89c38e502466882c881c05db382a6e79ac24b37488dfd1cf8671"} Dec 17 09:10:45 crc kubenswrapper[4935]: I1217 09:10:45.791709 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zpgjx"] Dec 17 09:10:45 crc kubenswrapper[4935]: W1217 09:10:45.796545 4935 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e4d8134_256e_4ca3_adb5_7dc5beaf48ac.slice/crio-d85d2705f1a1d5673221d1249597b28a495f061c50f2ff2bb19a30eead36e406 WatchSource:0}: Error finding container d85d2705f1a1d5673221d1249597b28a495f061c50f2ff2bb19a30eead36e406: Status 404 returned error can't find the container with id d85d2705f1a1d5673221d1249597b28a495f061c50f2ff2bb19a30eead36e406 Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.737895 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-55vfx" event={"ID":"2fe737c1-a407-4da3-a492-a49c892b1db9","Type":"ContainerStarted","Data":"f6e291eda9e3e0749360615918307aa1d9c98a159e892c66fdeae558abc8abc4"} Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.740727 4935 generic.go:334] "Generic (PLEG): container finished" podID="618e1775-1752-4cf2-b98a-562046bdc3f6" containerID="5cf14fcad54a01a9fe6de5306950f16ddc91bfd1e27d3e2041ec2492a6f9cb29" exitCode=0 Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.740814 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvpv9" event={"ID":"618e1775-1752-4cf2-b98a-562046bdc3f6","Type":"ContainerDied","Data":"5cf14fcad54a01a9fe6de5306950f16ddc91bfd1e27d3e2041ec2492a6f9cb29"} Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.745196 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s9kxf" event={"ID":"13e7e638-f648-4b5f-9589-9258a45be193","Type":"ContainerStarted","Data":"12ba0577faae55eced61f6e914d5085f15c36376dc783406777a59fb926347b4"} Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.746516 4935 generic.go:334] "Generic (PLEG): container finished" podID="6e4d8134-256e-4ca3-adb5-7dc5beaf48ac" containerID="70cc600587992cfe5aaf4aa5f8033c4cde970db578caf0e50c7bd65241e7235c" exitCode=0 Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.746542 4935 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zpgjx" event={"ID":"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac","Type":"ContainerDied","Data":"70cc600587992cfe5aaf4aa5f8033c4cde970db578caf0e50c7bd65241e7235c"} Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.746576 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpgjx" event={"ID":"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac","Type":"ContainerStarted","Data":"d85d2705f1a1d5673221d1249597b28a495f061c50f2ff2bb19a30eead36e406"} Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.778951 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-55vfx" podStartSLOduration=2.333758391 podStartE2EDuration="4.778933847s" podCreationTimestamp="2025-12-17 09:10:42 +0000 UTC" firstStartedPulling="2025-12-17 09:10:43.703914308 +0000 UTC m=+363.363755071" lastFinishedPulling="2025-12-17 09:10:46.149089764 +0000 UTC m=+365.808930527" observedRunningTime="2025-12-17 09:10:46.762058191 +0000 UTC m=+366.421898954" watchObservedRunningTime="2025-12-17 09:10:46.778933847 +0000 UTC m=+366.438774610" Dec 17 09:10:46 crc kubenswrapper[4935]: I1217 09:10:46.798209 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s9kxf" podStartSLOduration=2.871495635 podStartE2EDuration="5.798183714s" podCreationTimestamp="2025-12-17 09:10:41 +0000 UTC" firstStartedPulling="2025-12-17 09:10:42.695992937 +0000 UTC m=+362.355833700" lastFinishedPulling="2025-12-17 09:10:45.622681016 +0000 UTC m=+365.282521779" observedRunningTime="2025-12-17 09:10:46.796310087 +0000 UTC m=+366.456150850" watchObservedRunningTime="2025-12-17 09:10:46.798183714 +0000 UTC m=+366.458024477" Dec 17 09:10:47 crc kubenswrapper[4935]: I1217 09:10:47.755678 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wvpv9" 
event={"ID":"618e1775-1752-4cf2-b98a-562046bdc3f6","Type":"ContainerStarted","Data":"b5f72ef14a081db15e1e0080dd739b814c07a5ec53aff98cd4c5aa1a29ad5216"} Dec 17 09:10:47 crc kubenswrapper[4935]: I1217 09:10:47.759892 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpgjx" event={"ID":"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac","Type":"ContainerStarted","Data":"244d9f287d19faf9902a95264e3709e368036227ca0be688364cccdadb1bf674"} Dec 17 09:10:47 crc kubenswrapper[4935]: I1217 09:10:47.811737 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wvpv9" podStartSLOduration=2.128377332 podStartE2EDuration="3.811709337s" podCreationTimestamp="2025-12-17 09:10:44 +0000 UTC" firstStartedPulling="2025-12-17 09:10:45.732446822 +0000 UTC m=+365.392287585" lastFinishedPulling="2025-12-17 09:10:47.415778827 +0000 UTC m=+367.075619590" observedRunningTime="2025-12-17 09:10:47.77863225 +0000 UTC m=+367.438473113" watchObservedRunningTime="2025-12-17 09:10:47.811709337 +0000 UTC m=+367.471550100" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.037321 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6488c7567-ltv2h"] Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.037620 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" podUID="121b9af2-90da-4a68-9858-7980c0807055" containerName="controller-manager" containerID="cri-o://54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5" gracePeriod=30 Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.557689 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.631831 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6hbp\" (UniqueName: \"kubernetes.io/projected/121b9af2-90da-4a68-9858-7980c0807055-kube-api-access-n6hbp\") pod \"121b9af2-90da-4a68-9858-7980c0807055\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.632460 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-proxy-ca-bundles\") pod \"121b9af2-90da-4a68-9858-7980c0807055\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.632488 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-config\") pod \"121b9af2-90da-4a68-9858-7980c0807055\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.632571 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-client-ca\") pod \"121b9af2-90da-4a68-9858-7980c0807055\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.632637 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b9af2-90da-4a68-9858-7980c0807055-serving-cert\") pod \"121b9af2-90da-4a68-9858-7980c0807055\" (UID: \"121b9af2-90da-4a68-9858-7980c0807055\") " Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.633527 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "121b9af2-90da-4a68-9858-7980c0807055" (UID: "121b9af2-90da-4a68-9858-7980c0807055"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.633611 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-config" (OuterVolumeSpecName: "config") pod "121b9af2-90da-4a68-9858-7980c0807055" (UID: "121b9af2-90da-4a68-9858-7980c0807055"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.633802 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-client-ca" (OuterVolumeSpecName: "client-ca") pod "121b9af2-90da-4a68-9858-7980c0807055" (UID: "121b9af2-90da-4a68-9858-7980c0807055"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.642378 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/121b9af2-90da-4a68-9858-7980c0807055-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "121b9af2-90da-4a68-9858-7980c0807055" (UID: "121b9af2-90da-4a68-9858-7980c0807055"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.642681 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/121b9af2-90da-4a68-9858-7980c0807055-kube-api-access-n6hbp" (OuterVolumeSpecName: "kube-api-access-n6hbp") pod "121b9af2-90da-4a68-9858-7980c0807055" (UID: "121b9af2-90da-4a68-9858-7980c0807055"). InnerVolumeSpecName "kube-api-access-n6hbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.734516 4935 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-client-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.734556 4935 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/121b9af2-90da-4a68-9858-7980c0807055-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.734567 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6hbp\" (UniqueName: \"kubernetes.io/projected/121b9af2-90da-4a68-9858-7980c0807055-kube-api-access-n6hbp\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.734579 4935 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.734589 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/121b9af2-90da-4a68-9858-7980c0807055-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.766347 4935 generic.go:334] "Generic (PLEG): container finished" podID="121b9af2-90da-4a68-9858-7980c0807055" containerID="54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5" exitCode=0 Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.766436 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" event={"ID":"121b9af2-90da-4a68-9858-7980c0807055","Type":"ContainerDied","Data":"54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5"} Dec 17 09:10:48 crc 
kubenswrapper[4935]: I1217 09:10:48.766472 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" event={"ID":"121b9af2-90da-4a68-9858-7980c0807055","Type":"ContainerDied","Data":"2aa7a24e588f6919e4fcd0bad8ff3c713c62a123b6f8f9c1e5951406df0b9bd1"} Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.766491 4935 scope.go:117] "RemoveContainer" containerID="54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.766613 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6488c7567-ltv2h" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.771335 4935 generic.go:334] "Generic (PLEG): container finished" podID="6e4d8134-256e-4ca3-adb5-7dc5beaf48ac" containerID="244d9f287d19faf9902a95264e3709e368036227ca0be688364cccdadb1bf674" exitCode=0 Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.771457 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpgjx" event={"ID":"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac","Type":"ContainerDied","Data":"244d9f287d19faf9902a95264e3709e368036227ca0be688364cccdadb1bf674"} Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.802004 4935 scope.go:117] "RemoveContainer" containerID="54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5" Dec 17 09:10:48 crc kubenswrapper[4935]: E1217 09:10:48.802761 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5\": container with ID starting with 54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5 not found: ID does not exist" containerID="54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.802924 4935 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5"} err="failed to get container status \"54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5\": rpc error: code = NotFound desc = could not find container \"54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5\": container with ID starting with 54d9c044297cdeca46b045b4887bd944e3deae5d300662b154ed49e85dc708f5 not found: ID does not exist" Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.815241 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6488c7567-ltv2h"] Dec 17 09:10:48 crc kubenswrapper[4935]: I1217 09:10:48.819603 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6488c7567-ltv2h"] Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.131767 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="121b9af2-90da-4a68-9858-7980c0807055" path="/var/lib/kubelet/pods/121b9af2-90da-4a68-9858-7980c0807055/volumes" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.257073 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc"] Dec 17 09:10:49 crc kubenswrapper[4935]: E1217 09:10:49.257353 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="121b9af2-90da-4a68-9858-7980c0807055" containerName="controller-manager" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.257366 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="121b9af2-90da-4a68-9858-7980c0807055" containerName="controller-manager" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.257466 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="121b9af2-90da-4a68-9858-7980c0807055" containerName="controller-manager" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 
09:10:49.257873 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.260522 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.260894 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.261036 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.262234 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.262375 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.262449 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.269381 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.271448 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc"] Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.343549 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef6a7545-793a-4bd7-9193-c695004c0a63-serving-cert\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: 
\"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.343595 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swx8s\" (UniqueName: \"kubernetes.io/projected/ef6a7545-793a-4bd7-9193-c695004c0a63-kube-api-access-swx8s\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.343684 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-client-ca\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.343847 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-config\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.343978 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-proxy-ca-bundles\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.445378 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef6a7545-793a-4bd7-9193-c695004c0a63-serving-cert\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.445957 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swx8s\" (UniqueName: \"kubernetes.io/projected/ef6a7545-793a-4bd7-9193-c695004c0a63-kube-api-access-swx8s\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.446034 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-client-ca\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.446064 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-config\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.446098 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-proxy-ca-bundles\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc 
kubenswrapper[4935]: I1217 09:10:49.447035 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-client-ca\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.447833 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-proxy-ca-bundles\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.450315 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef6a7545-793a-4bd7-9193-c695004c0a63-config\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.455108 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef6a7545-793a-4bd7-9193-c695004c0a63-serving-cert\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:49 crc kubenswrapper[4935]: I1217 09:10:49.464689 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swx8s\" (UniqueName: \"kubernetes.io/projected/ef6a7545-793a-4bd7-9193-c695004c0a63-kube-api-access-swx8s\") pod \"controller-manager-7c7c4c4bf5-zbqcc\" (UID: \"ef6a7545-793a-4bd7-9193-c695004c0a63\") " 
pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:50 crc kubenswrapper[4935]: I1217 09:10:50.591413 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:50 crc kubenswrapper[4935]: I1217 09:10:50.789417 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zpgjx" event={"ID":"6e4d8134-256e-4ca3-adb5-7dc5beaf48ac","Type":"ContainerStarted","Data":"a02500bc8eb15fbd36575fc74299daf1a68d453985a4566104eafa9900236fc8"} Dec 17 09:10:50 crc kubenswrapper[4935]: I1217 09:10:50.808832 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc"] Dec 17 09:10:50 crc kubenswrapper[4935]: I1217 09:10:50.816987 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zpgjx" podStartSLOduration=2.65443434 podStartE2EDuration="5.816973392s" podCreationTimestamp="2025-12-17 09:10:45 +0000 UTC" firstStartedPulling="2025-12-17 09:10:46.747764829 +0000 UTC m=+366.407605592" lastFinishedPulling="2025-12-17 09:10:49.910303891 +0000 UTC m=+369.570144644" observedRunningTime="2025-12-17 09:10:50.813540186 +0000 UTC m=+370.473380949" watchObservedRunningTime="2025-12-17 09:10:50.816973392 +0000 UTC m=+370.476814155" Dec 17 09:10:50 crc kubenswrapper[4935]: W1217 09:10:50.818914 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef6a7545_793a_4bd7_9193_c695004c0a63.slice/crio-3e879bbaffe42a09d2d66da8cce823ec50437a7d73609870b187be9b6a08e257 WatchSource:0}: Error finding container 3e879bbaffe42a09d2d66da8cce823ec50437a7d73609870b187be9b6a08e257: Status 404 returned error can't find the container with id 3e879bbaffe42a09d2d66da8cce823ec50437a7d73609870b187be9b6a08e257 Dec 17 09:10:51 crc 
kubenswrapper[4935]: I1217 09:10:51.797304 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" event={"ID":"ef6a7545-793a-4bd7-9193-c695004c0a63","Type":"ContainerStarted","Data":"ccfae3606dc86d6ce2d6e28fdc02d650b8d1d3736e603d87c059de50aca66602"} Dec 17 09:10:51 crc kubenswrapper[4935]: I1217 09:10:51.797693 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" event={"ID":"ef6a7545-793a-4bd7-9193-c695004c0a63","Type":"ContainerStarted","Data":"3e879bbaffe42a09d2d66da8cce823ec50437a7d73609870b187be9b6a08e257"} Dec 17 09:10:51 crc kubenswrapper[4935]: I1217 09:10:51.978231 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s9kxf" Dec 17 09:10:51 crc kubenswrapper[4935]: I1217 09:10:51.978297 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s9kxf" Dec 17 09:10:52 crc kubenswrapper[4935]: I1217 09:10:52.029197 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s9kxf" Dec 17 09:10:52 crc kubenswrapper[4935]: I1217 09:10:52.055582 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" podStartSLOduration=4.055561795 podStartE2EDuration="4.055561795s" podCreationTimestamp="2025-12-17 09:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:10:51.817087027 +0000 UTC m=+371.476927790" watchObservedRunningTime="2025-12-17 09:10:52.055561795 +0000 UTC m=+371.715402558" Dec 17 09:10:52 crc kubenswrapper[4935]: I1217 09:10:52.802925 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:52 crc kubenswrapper[4935]: I1217 09:10:52.808197 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c7c4c4bf5-zbqcc" Dec 17 09:10:52 crc kubenswrapper[4935]: I1217 09:10:52.856564 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s9kxf" Dec 17 09:10:52 crc kubenswrapper[4935]: I1217 09:10:52.970426 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:52 crc kubenswrapper[4935]: I1217 09:10:52.971421 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:53 crc kubenswrapper[4935]: I1217 09:10:53.011034 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:53 crc kubenswrapper[4935]: I1217 09:10:53.863201 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-55vfx" Dec 17 09:10:54 crc kubenswrapper[4935]: I1217 09:10:54.372136 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:54 crc kubenswrapper[4935]: I1217 09:10:54.372227 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:54 crc kubenswrapper[4935]: I1217 09:10:54.428481 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:54 crc kubenswrapper[4935]: I1217 09:10:54.861486 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wvpv9" Dec 17 09:10:55 crc 
kubenswrapper[4935]: I1217 09:10:55.384745 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:55 crc kubenswrapper[4935]: I1217 09:10:55.385132 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:55 crc kubenswrapper[4935]: I1217 09:10:55.430934 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:10:55 crc kubenswrapper[4935]: I1217 09:10:55.859952 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zpgjx" Dec 17 09:11:00 crc kubenswrapper[4935]: I1217 09:11:00.130806 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:11:00 crc kubenswrapper[4935]: I1217 09:11:00.131225 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:11:02 crc kubenswrapper[4935]: I1217 09:11:02.927664 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-67jgk" Dec 17 09:11:02 crc kubenswrapper[4935]: I1217 09:11:02.987389 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g8v79"] Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.039489 4935 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" podUID="2aea1606-ff6f-4325-9f92-c83e2c5079c0" containerName="registry" containerID="cri-o://c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab" gracePeriod=30 Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.482688 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.564633 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-tls\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.564750 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-bound-sa-token\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.564794 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2aea1606-ff6f-4325-9f92-c83e2c5079c0-installation-pull-secrets\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.564815 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-certificates\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.564854 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2aea1606-ff6f-4325-9f92-c83e2c5079c0-ca-trust-extracted\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.565812 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.565905 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-trusted-ca\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.565956 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dggwp\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-kube-api-access-dggwp\") pod \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\" (UID: \"2aea1606-ff6f-4325-9f92-c83e2c5079c0\") " Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.566431 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.566855 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.567335 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.567374 4935 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.573894 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.574739 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.574971 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-kube-api-access-dggwp" (OuterVolumeSpecName: "kube-api-access-dggwp") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "kube-api-access-dggwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.575760 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aea1606-ff6f-4325-9f92-c83e2c5079c0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.579904 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.588048 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aea1606-ff6f-4325-9f92-c83e2c5079c0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2aea1606-ff6f-4325-9f92-c83e2c5079c0" (UID: "2aea1606-ff6f-4325-9f92-c83e2c5079c0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.669252 4935 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.669324 4935 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2aea1606-ff6f-4325-9f92-c83e2c5079c0-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.669341 4935 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2aea1606-ff6f-4325-9f92-c83e2c5079c0-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.669351 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dggwp\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-kube-api-access-dggwp\") on node \"crc\" DevicePath \"\""
Dec 17 09:11:28 crc kubenswrapper[4935]: I1217 09:11:28.669363 4935 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2aea1606-ff6f-4325-9f92-c83e2c5079c0-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.056263 4935 generic.go:334] "Generic (PLEG): container finished" podID="2aea1606-ff6f-4325-9f92-c83e2c5079c0" containerID="c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab" exitCode=0
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.056369 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" event={"ID":"2aea1606-ff6f-4325-9f92-c83e2c5079c0","Type":"ContainerDied","Data":"c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab"}
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.056421 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79" event={"ID":"2aea1606-ff6f-4325-9f92-c83e2c5079c0","Type":"ContainerDied","Data":"30cb1279a3fee019e342ba7482e208590983ff2d677c74b287d186f6d76ff0cb"}
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.056454 4935 scope.go:117] "RemoveContainer" containerID="c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab"
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.056371 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-g8v79"
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.081811 4935 scope.go:117] "RemoveContainer" containerID="c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab"
Dec 17 09:11:29 crc kubenswrapper[4935]: E1217 09:11:29.082569 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab\": container with ID starting with c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab not found: ID does not exist" containerID="c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab"
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.082629 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab"} err="failed to get container status \"c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab\": rpc error: code = NotFound desc = could not find container \"c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab\": container with ID starting with c88092d8154d2fd984348e57433171897684dbb6f52dd2066355895e885e27ab not found: ID does not exist"
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.105147 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g8v79"]
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.108635 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-g8v79"]
Dec 17 09:11:29 crc kubenswrapper[4935]: I1217 09:11:29.137097 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aea1606-ff6f-4325-9f92-c83e2c5079c0" path="/var/lib/kubelet/pods/2aea1606-ff6f-4325-9f92-c83e2c5079c0/volumes"
Dec 17 09:11:30 crc kubenswrapper[4935]: I1217 09:11:30.131127 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 17 09:11:30 crc kubenswrapper[4935]: I1217 09:11:30.131221 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.130543 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.131233 4935 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.131302 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw"
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.131919 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5e3223631fb199364a4d543a8278a4bff01bf27d2ba3883d6f2f33ee501b687"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.131976 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://e5e3223631fb199364a4d543a8278a4bff01bf27d2ba3883d6f2f33ee501b687" gracePeriod=600
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.276398 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="e5e3223631fb199364a4d543a8278a4bff01bf27d2ba3883d6f2f33ee501b687" exitCode=0
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.276763 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"e5e3223631fb199364a4d543a8278a4bff01bf27d2ba3883d6f2f33ee501b687"}
Dec 17 09:12:00 crc kubenswrapper[4935]: I1217 09:12:00.276963 4935 scope.go:117] "RemoveContainer" containerID="8e85381092eaa64add8dae8ab91314b2dfb7c3be9538be99baa39112b947bfb8"
Dec 17 09:12:01 crc kubenswrapper[4935]: I1217 09:12:01.285480 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"c0b861ed431cd38a53635da47f417dba930fef91f14b1f917781fd9245bf2b9b"}
Dec 17 09:14:00 crc kubenswrapper[4935]: I1217 09:14:00.130914 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 17 09:14:00 crc kubenswrapper[4935]: I1217 09:14:00.131826 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 17 09:14:30 crc kubenswrapper[4935]: I1217 09:14:30.131484 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 17 09:14:30 crc kubenswrapper[4935]: I1217 09:14:30.132295 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.131448 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.132118 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.132176 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.133016 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0b861ed431cd38a53635da47f417dba930fef91f14b1f917781fd9245bf2b9b"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.133111 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://c0b861ed431cd38a53635da47f417dba930fef91f14b1f917781fd9245bf2b9b" gracePeriod=600
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.184946 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"]
Dec 17 09:15:00 crc kubenswrapper[4935]: E1217 09:15:00.185240 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aea1606-ff6f-4325-9f92-c83e2c5079c0" containerName="registry"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.185261 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aea1606-ff6f-4325-9f92-c83e2c5079c0" containerName="registry"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.185376 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aea1606-ff6f-4325-9f92-c83e2c5079c0" containerName="registry"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.185806 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.189026 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.190436 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.200312 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"]
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.214368 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9411393b-db02-442f-a053-05760712be53-secret-volume\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.214463 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9411393b-db02-442f-a053-05760712be53-config-volume\") pod
\"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.214497 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hlhh\" (UniqueName: \"kubernetes.io/projected/9411393b-db02-442f-a053-05760712be53-kube-api-access-6hlhh\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.310671 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="c0b861ed431cd38a53635da47f417dba930fef91f14b1f917781fd9245bf2b9b" exitCode=0
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.310926 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"c0b861ed431cd38a53635da47f417dba930fef91f14b1f917781fd9245bf2b9b"}
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.310999 4935 scope.go:117] "RemoveContainer" containerID="e5e3223631fb199364a4d543a8278a4bff01bf27d2ba3883d6f2f33ee501b687"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.315137 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9411393b-db02-442f-a053-05760712be53-config-volume\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.315266 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hlhh\" (UniqueName: \"kubernetes.io/projected/9411393b-db02-442f-a053-05760712be53-kube-api-access-6hlhh\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.315741 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9411393b-db02-442f-a053-05760712be53-secret-volume\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.316438 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9411393b-db02-442f-a053-05760712be53-config-volume\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.322369 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9411393b-db02-442f-a053-05760712be53-secret-volume\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.332577 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hlhh\" (UniqueName: \"kubernetes.io/projected/9411393b-db02-442f-a053-05760712be53-kube-api-access-6hlhh\") pod \"collect-profiles-29432715-p95b5\" (UID: \"9411393b-db02-442f-a053-05760712be53\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.506553 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:00 crc kubenswrapper[4935]: I1217 09:15:00.703646 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"]
Dec 17 09:15:01 crc kubenswrapper[4935]: I1217 09:15:01.319387 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"0452e123d683b48fb5da02863983a6b9a8cd0b2246d43611082e994f7f11e20b"}
Dec 17 09:15:01 crc kubenswrapper[4935]: I1217 09:15:01.321210 4935 generic.go:334] "Generic (PLEG): container finished" podID="9411393b-db02-442f-a053-05760712be53" containerID="55228e4591950f0dbf1234ba423d05d343a7c3c6f7f65e081d916de0945a3440" exitCode=0
Dec 17 09:15:01 crc kubenswrapper[4935]: I1217 09:15:01.321267 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5" event={"ID":"9411393b-db02-442f-a053-05760712be53","Type":"ContainerDied","Data":"55228e4591950f0dbf1234ba423d05d343a7c3c6f7f65e081d916de0945a3440"}
Dec 17 09:15:01 crc kubenswrapper[4935]: I1217 09:15:01.321324 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5" event={"ID":"9411393b-db02-442f-a053-05760712be53","Type":"ContainerStarted","Data":"9edbea5d519676aca8f06cbda090c48cc439910f510a73f8084dd19117b79b59"}
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.539908 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.541185 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9411393b-db02-442f-a053-05760712be53-config-volume\") pod \"9411393b-db02-442f-a053-05760712be53\" (UID: \"9411393b-db02-442f-a053-05760712be53\") "
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.541249 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hlhh\" (UniqueName: \"kubernetes.io/projected/9411393b-db02-442f-a053-05760712be53-kube-api-access-6hlhh\") pod \"9411393b-db02-442f-a053-05760712be53\" (UID: \"9411393b-db02-442f-a053-05760712be53\") "
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.541331 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9411393b-db02-442f-a053-05760712be53-secret-volume\") pod \"9411393b-db02-442f-a053-05760712be53\" (UID: \"9411393b-db02-442f-a053-05760712be53\") "
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.541852 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9411393b-db02-442f-a053-05760712be53-config-volume" (OuterVolumeSpecName: "config-volume") pod "9411393b-db02-442f-a053-05760712be53" (UID: "9411393b-db02-442f-a053-05760712be53"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.547303 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9411393b-db02-442f-a053-05760712be53-kube-api-access-6hlhh" (OuterVolumeSpecName: "kube-api-access-6hlhh") pod "9411393b-db02-442f-a053-05760712be53" (UID: "9411393b-db02-442f-a053-05760712be53").
InnerVolumeSpecName "kube-api-access-6hlhh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.547327 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9411393b-db02-442f-a053-05760712be53-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9411393b-db02-442f-a053-05760712be53" (UID: "9411393b-db02-442f-a053-05760712be53"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.642518 4935 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9411393b-db02-442f-a053-05760712be53-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.642563 4935 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9411393b-db02-442f-a053-05760712be53-config-volume\") on node \"crc\" DevicePath \"\""
Dec 17 09:15:02 crc kubenswrapper[4935]: I1217 09:15:02.642579 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hlhh\" (UniqueName: \"kubernetes.io/projected/9411393b-db02-442f-a053-05760712be53-kube-api-access-6hlhh\") on node \"crc\" DevicePath \"\""
Dec 17 09:15:03 crc kubenswrapper[4935]: I1217 09:15:03.332366 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5" event={"ID":"9411393b-db02-442f-a053-05760712be53","Type":"ContainerDied","Data":"9edbea5d519676aca8f06cbda090c48cc439910f510a73f8084dd19117b79b59"}
Dec 17 09:15:03 crc kubenswrapper[4935]: I1217 09:15:03.332418 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9edbea5d519676aca8f06cbda090c48cc439910f510a73f8084dd19117b79b59"
Dec 17 09:15:03 crc kubenswrapper[4935]: I1217 09:15:03.332447 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.037215 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"]
Dec 17 09:16:42 crc kubenswrapper[4935]: E1217 09:16:42.038299 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9411393b-db02-442f-a053-05760712be53" containerName="collect-profiles"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.038317 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="9411393b-db02-442f-a053-05760712be53" containerName="collect-profiles"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.038435 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="9411393b-db02-442f-a053-05760712be53" containerName="collect-profiles"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.038864 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.043541 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.043594 4935 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-mhhhp"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.043658 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.046256 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-r966z"]
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.047459 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-r966z"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.049636 4935 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-slp5l"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.062018 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"]
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.065410 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-r966z"]
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.068288 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"]
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.069123 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.075305 4935 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jcnkk"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.080377 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"]
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.193450 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc8dh\" (UniqueName: \"kubernetes.io/projected/ece85099-fa51-490d-a498-cf35ec83a8ad-kube-api-access-qc8dh\") pod \"cert-manager-5b446d88c5-r966z\" (UID: \"ece85099-fa51-490d-a498-cf35ec83a8ad\") " pod="cert-manager/cert-manager-5b446d88c5-r966z"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.193554 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvbb6\" (UniqueName: \"kubernetes.io/projected/be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1-kube-api-access-vvbb6\") pod \"cert-manager-webhook-5655c58dd6-5hx6v\" (UID: \"be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.193577 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmhkg\" (UniqueName: \"kubernetes.io/projected/4d701ce2-c118-46f4-904b-5294c782ce68-kube-api-access-gmhkg\") pod \"cert-manager-cainjector-7f985d654d-wbl2l\" (UID: \"4d701ce2-c118-46f4-904b-5294c782ce68\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.295164 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc8dh\" (UniqueName: \"kubernetes.io/projected/ece85099-fa51-490d-a498-cf35ec83a8ad-kube-api-access-qc8dh\") pod \"cert-manager-5b446d88c5-r966z\" (UID: \"ece85099-fa51-490d-a498-cf35ec83a8ad\") " pod="cert-manager/cert-manager-5b446d88c5-r966z"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.295320 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvbb6\" (UniqueName: \"kubernetes.io/projected/be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1-kube-api-access-vvbb6\") pod \"cert-manager-webhook-5655c58dd6-5hx6v\" (UID: \"be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.295369 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmhkg\" (UniqueName: \"kubernetes.io/projected/4d701ce2-c118-46f4-904b-5294c782ce68-kube-api-access-gmhkg\") pod \"cert-manager-cainjector-7f985d654d-wbl2l\" (UID: \"4d701ce2-c118-46f4-904b-5294c782ce68\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217
09:16:42.317118 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvbb6\" (UniqueName: \"kubernetes.io/projected/be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1-kube-api-access-vvbb6\") pod \"cert-manager-webhook-5655c58dd6-5hx6v\" (UID: \"be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.322447 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc8dh\" (UniqueName: \"kubernetes.io/projected/ece85099-fa51-490d-a498-cf35ec83a8ad-kube-api-access-qc8dh\") pod \"cert-manager-5b446d88c5-r966z\" (UID: \"ece85099-fa51-490d-a498-cf35ec83a8ad\") " pod="cert-manager/cert-manager-5b446d88c5-r966z"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.325057 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmhkg\" (UniqueName: \"kubernetes.io/projected/4d701ce2-c118-46f4-904b-5294c782ce68-kube-api-access-gmhkg\") pod \"cert-manager-cainjector-7f985d654d-wbl2l\" (UID: \"4d701ce2-c118-46f4-904b-5294c782ce68\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.361142 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.371934 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-r966z"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.392192 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.594618 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wbl2l"]
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.605677 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.855501 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"]
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.860467 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-r966z"]
Dec 17 09:16:42 crc kubenswrapper[4935]: W1217 09:16:42.862743 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podece85099_fa51_490d_a498_cf35ec83a8ad.slice/crio-191ab4804f28d7a20cc1fc2efe4b5f1e01f05c2d34ba369bb0f84c60478c74b9 WatchSource:0}: Error finding container 191ab4804f28d7a20cc1fc2efe4b5f1e01f05c2d34ba369bb0f84c60478c74b9: Status 404 returned error can't find the container with id 191ab4804f28d7a20cc1fc2efe4b5f1e01f05c2d34ba369bb0f84c60478c74b9
Dec 17 09:16:42 crc kubenswrapper[4935]: W1217 09:16:42.864529 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe7a439b_efa4_4fdd_b56d_5e8d53f2ceb1.slice/crio-fc1d19c8e03fdc75834a4742449d6274aa101cefe9821744201345044be695cf WatchSource:0}: Error finding container fc1d19c8e03fdc75834a4742449d6274aa101cefe9821744201345044be695cf: Status 404 returned error can't find the container with id fc1d19c8e03fdc75834a4742449d6274aa101cefe9821744201345044be695cf
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.942224 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v" event={"ID":"be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1","Type":"ContainerStarted","Data":"fc1d19c8e03fdc75834a4742449d6274aa101cefe9821744201345044be695cf"}
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.943302 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-r966z" event={"ID":"ece85099-fa51-490d-a498-cf35ec83a8ad","Type":"ContainerStarted","Data":"191ab4804f28d7a20cc1fc2efe4b5f1e01f05c2d34ba369bb0f84c60478c74b9"}
Dec 17 09:16:42 crc kubenswrapper[4935]: I1217 09:16:42.944215 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l" event={"ID":"4d701ce2-c118-46f4-904b-5294c782ce68","Type":"ContainerStarted","Data":"7e60835e4eb6e013ca68068a2f62b9756dd49c29374264ff9c7a918d8f8757fe"}
Dec 17 09:16:46 crc kubenswrapper[4935]: I1217 09:16:46.974198 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l" event={"ID":"4d701ce2-c118-46f4-904b-5294c782ce68","Type":"ContainerStarted","Data":"c0029ad40d34fd3d74f4741f66f8b4ce3620285848b222b6e2e8e22dbdfc523b"}
Dec 17 09:16:46 crc kubenswrapper[4935]: I1217 09:16:46.976519 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v" event={"ID":"be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1","Type":"ContainerStarted","Data":"e821027d4e536fa13a70587fc6d668e5e7aca8bd22b7bf559857f1f0cd963308"}
Dec 17 09:16:46 crc kubenswrapper[4935]: I1217 09:16:46.977045 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v"
Dec 17 09:16:46 crc kubenswrapper[4935]: I1217 09:16:46.980139 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-r966z" event={"ID":"ece85099-fa51-490d-a498-cf35ec83a8ad","Type":"ContainerStarted","Data":"b8c12599eefe39af29920981dfad7bbe6d6b6fa4fb9a940571df54c86449516c"}
Dec 17 09:16:47 crc kubenswrapper[4935]: I1217 09:16:47.004573 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wbl2l" podStartSLOduration=1.674013833 podStartE2EDuration="5.004544102s" podCreationTimestamp="2025-12-17 09:16:42 +0000 UTC" firstStartedPulling="2025-12-17 09:16:42.6053495 +0000 UTC m=+722.265190263" lastFinishedPulling="2025-12-17 09:16:45.935879769 +0000 UTC m=+725.595720532" observedRunningTime="2025-12-17 09:16:46.997626594 +0000 UTC m=+726.657467387" watchObservedRunningTime="2025-12-17 09:16:47.004544102 +0000 UTC m=+726.664384895"
Dec 17 09:16:47 crc kubenswrapper[4935]: I1217 09:16:47.017461 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v" podStartSLOduration=1.875500379 podStartE2EDuration="5.017414217s" podCreationTimestamp="2025-12-17 09:16:42 +0000 UTC" firstStartedPulling="2025-12-17 09:16:42.867875743 +0000 UTC m=+722.527716506" lastFinishedPulling="2025-12-17 09:16:46.009789581 +0000 UTC m=+725.669630344" observedRunningTime="2025-12-17 09:16:47.016094565 +0000 UTC m=+726.675935358" watchObservedRunningTime="2025-12-17 09:16:47.017414217 +0000 UTC m=+726.677255020"
Dec 17 09:16:47 crc kubenswrapper[4935]: I1217 09:16:47.037586 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-r966z" podStartSLOduration=1.965234377 podStartE2EDuration="5.037558018s" podCreationTimestamp="2025-12-17 09:16:42 +0000 UTC" firstStartedPulling="2025-12-17 09:16:42.865173157 +0000 UTC m=+722.525013920" lastFinishedPulling="2025-12-17 09:16:45.937496758 +0000 UTC m=+725.597337561" observedRunningTime="2025-12-17 09:16:47.034748289 +0000 UTC m=+726.694589052" watchObservedRunningTime="2025-12-17
09:16:47.037558018 +0000 UTC m=+726.697398791" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.396690 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-5hx6v" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.532587 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rwwd4"] Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.533068 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-controller" containerID="cri-o://ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.533198 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.533237 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-node" containerID="cri-o://38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.533441 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="nbdb" containerID="cri-o://a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.533436 4935 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="sbdb" containerID="cri-o://18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.533331 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-acl-logging" containerID="cri-o://2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.533591 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="northd" containerID="cri-o://fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.576315 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" containerID="cri-o://fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" gracePeriod=30 Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.870410 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/3.log" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.872726 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovn-acl-logging/0.log" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.873405 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovn-controller/0.log" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.873938 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934393 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-f28dc"] Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934708 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934727 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934743 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934752 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934764 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="sbdb" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934774 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="sbdb" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934784 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934792 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934804 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934811 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934822 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="northd" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934829 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="northd" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934843 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-node" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934851 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-node" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934867 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="nbdb" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934876 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="nbdb" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934890 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934903 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934918 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-acl-logging" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934928 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-acl-logging" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934945 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kubecfg-setup" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934955 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kubecfg-setup" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.934969 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-ovn-metrics" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.934984 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-ovn-metrics" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935152 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="nbdb" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935175 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935189 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-acl-logging" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935202 4935 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935211 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="northd" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935222 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-ovn-metrics" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935237 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="sbdb" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935247 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935257 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="kube-rbac-proxy-node" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935268 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovn-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: E1217 09:16:52.935454 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935464 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935597 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.935610 4935 
memory_manager.go:354] "RemoveStaleState removing state" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerName="ovnkube-controller" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.937788 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.945929 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovn-node-metrics-cert\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.945974 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-openvswitch\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.945996 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-slash\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946025 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-ovn\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946047 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-script-lib\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946078 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-systemd-units\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946235 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-run-ovn-kubernetes\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946261 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-var-lib-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946316 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-log-socket\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946334 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-kubelet\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946350 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-ovn\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946370 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946399 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-cni-netd\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946422 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-ovnkube-script-lib\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946442 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-ovnkube-config\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946462 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0001f83-34e8-4b76-bd88-76712eacf85f-ovn-node-metrics-cert\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946478 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-env-overrides\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946497 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-systemd-units\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946515 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-etc-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946545 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-run-netns\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946560 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-systemd\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946585 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rl2n\" (UniqueName: \"kubernetes.io/projected/f0001f83-34e8-4b76-bd88-76712eacf85f-kube-api-access-7rl2n\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946608 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-slash\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946630 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-cni-bin\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 
09:16:52.946647 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-node-log\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946666 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946842 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946870 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946890 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-slash" (OuterVolumeSpecName: "host-slash") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.946909 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.947466 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:16:52 crc kubenswrapper[4935]: I1217 09:16:52.955128 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.037323 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovnkube-controller/3.log" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.046219 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovn-acl-logging/0.log" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047638 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-netns\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047681 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rwwd4_969f53bb-09fc-4577-8f7c-dc6ca1679add/ovn-controller/0.log" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047697 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-node-log\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047736 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ftrx\" (UniqueName: \"kubernetes.io/projected/969f53bb-09fc-4577-8f7c-dc6ca1679add-kube-api-access-8ftrx\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047756 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-systemd\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047772 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-log-socket\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047765 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047849 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047870 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-node-log" (OuterVolumeSpecName: "node-log") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047889 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-log-socket" (OuterVolumeSpecName: "log-socket") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047791 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-var-lib-openvswitch\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.047983 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-env-overrides\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048025 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-etc-openvswitch\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048061 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-netd\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048092 4935 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-kubelet\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048120 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-config\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048140 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-ovn-kubernetes\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048194 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048206 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-var-lib-cni-networks-ovn-kubernetes\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048339 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-bin\") pod \"969f53bb-09fc-4577-8f7c-dc6ca1679add\" (UID: \"969f53bb-09fc-4577-8f7c-dc6ca1679add\") " Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048530 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-cni-netd\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048569 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-ovnkube-script-lib\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048593 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-ovnkube-config\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc 
kubenswrapper[4935]: I1217 09:16:53.048633 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0001f83-34e8-4b76-bd88-76712eacf85f-ovn-node-metrics-cert\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048652 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-env-overrides\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048675 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-systemd-units\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048682 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048696 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-etc-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048722 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048764 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-etc-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048771 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-run-netns\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048801 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). 
InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048803 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-systemd\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048823 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-systemd\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048871 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rl2n\" (UniqueName: \"kubernetes.io/projected/f0001f83-34e8-4b76-bd88-76712eacf85f-kube-api-access-7rl2n\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048879 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-run-netns\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048913 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-slash\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 
09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048943 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-cni-bin\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048962 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-node-log\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.048984 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049002 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-run-ovn-kubernetes\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049027 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-var-lib-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049039 4935 
generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" exitCode=0 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049065 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-log-socket\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049086 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-kubelet\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049084 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049101 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-ovn\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049127 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f28dc\" (UID: 
\"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049146 4935 scope.go:117] "RemoveContainer" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049211 4935 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049223 4935 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-node-log\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049233 4935 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-log-socket\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049243 4935 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049287 4935 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049298 4935 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049306 4935 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049316 4935 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049326 4935 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049335 4935 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049344 4935 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-slash\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049355 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049354 4935 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049374 4935 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049383 4935 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049129 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049625 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-slash\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049658 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049683 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049706 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049735 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-cni-netd\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050451 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-env-overrides\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050545 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-ovnkube-script-lib\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050585 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-systemd-units\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050802 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-var-lib-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050847 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-kubelet\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050890 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-log-socket\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050897 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-cni-bin\") pod \"ovnkube-node-f28dc\" (UID: 
\"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.049067 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" exitCode=0 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050946 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" exitCode=0 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050975 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" exitCode=0 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050985 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" exitCode=0 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.050997 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" exitCode=0 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051008 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" exitCode=143 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051022 4935 generic.go:334] "Generic (PLEG): container finished" podID="969f53bb-09fc-4577-8f7c-dc6ca1679add" containerID="ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" exitCode=143 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051072 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f0001f83-34e8-4b76-bd88-76712eacf85f-ovnkube-config\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051114 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-ovn\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051142 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051169 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-run-openvswitch\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051190 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-host-run-ovn-kubernetes\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051196 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/f0001f83-34e8-4b76-bd88-76712eacf85f-node-log\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051213 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051310 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051342 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051362 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051382 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051398 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} Dec 17 09:16:53 
crc kubenswrapper[4935]: I1217 09:16:53.051409 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051418 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051425 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051434 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051442 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051450 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051458 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051469 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" 
event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051482 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051491 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051500 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051507 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051518 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051527 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051535 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051543 4935 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051552 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051559 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051569 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051573 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051581 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051607 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051620 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051626 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051632 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051638 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051643 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051649 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051655 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051661 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051674 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rwwd4" event={"ID":"969f53bb-09fc-4577-8f7c-dc6ca1679add","Type":"ContainerDied","Data":"d587b0881dce447b95d2989ed7f67a3b0cb238463a31ed8b3830fc424d506094"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051690 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051697 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051703 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051709 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051715 4935 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051721 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051726 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051732 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051738 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.051743 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.058230 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969f53bb-09fc-4577-8f7c-dc6ca1679add-kube-api-access-8ftrx" (OuterVolumeSpecName: "kube-api-access-8ftrx") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "kube-api-access-8ftrx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.060204 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f0001f83-34e8-4b76-bd88-76712eacf85f-ovn-node-metrics-cert\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.067734 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/2.log" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.068472 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "969f53bb-09fc-4577-8f7c-dc6ca1679add" (UID: "969f53bb-09fc-4577-8f7c-dc6ca1679add"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.068777 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/1.log" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.068830 4935 generic.go:334] "Generic (PLEG): container finished" podID="8b52811a-aff2-43c1-9074-f0654f991d9c" containerID="11369d6fada4674292fe86adc7a89a8519d4860f1afdfbc6ee9bb6e4a1e3d22e" exitCode=2 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.068892 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerDied","Data":"11369d6fada4674292fe86adc7a89a8519d4860f1afdfbc6ee9bb6e4a1e3d22e"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.068922 4935 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a"} Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.069574 4935 scope.go:117] "RemoveContainer" containerID="11369d6fada4674292fe86adc7a89a8519d4860f1afdfbc6ee9bb6e4a1e3d22e" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.081128 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.085302 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rl2n\" (UniqueName: \"kubernetes.io/projected/f0001f83-34e8-4b76-bd88-76712eacf85f-kube-api-access-7rl2n\") pod \"ovnkube-node-f28dc\" (UID: \"f0001f83-34e8-4b76-bd88-76712eacf85f\") " pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.103490 4935 scope.go:117] "RemoveContainer" 
containerID="18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.118238 4935 scope.go:117] "RemoveContainer" containerID="a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.133968 4935 scope.go:117] "RemoveContainer" containerID="fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.149715 4935 scope.go:117] "RemoveContainer" containerID="7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.151475 4935 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/969f53bb-09fc-4577-8f7c-dc6ca1679add-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.151511 4935 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.151527 4935 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.151539 4935 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.151553 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ftrx\" (UniqueName: 
\"kubernetes.io/projected/969f53bb-09fc-4577-8f7c-dc6ca1679add-kube-api-access-8ftrx\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.151563 4935 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/969f53bb-09fc-4577-8f7c-dc6ca1679add-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.170538 4935 scope.go:117] "RemoveContainer" containerID="38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.183826 4935 scope.go:117] "RemoveContainer" containerID="2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.197144 4935 scope.go:117] "RemoveContainer" containerID="ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.220478 4935 scope.go:117] "RemoveContainer" containerID="c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.234363 4935 scope.go:117] "RemoveContainer" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.234821 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": container with ID starting with fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330 not found: ID does not exist" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.234872 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} err="failed to get 
container status \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": rpc error: code = NotFound desc = could not find container \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": container with ID starting with fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.234903 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.235248 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": container with ID starting with b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319 not found: ID does not exist" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.235369 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} err="failed to get container status \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": rpc error: code = NotFound desc = could not find container \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": container with ID starting with b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.235468 4935 scope.go:117] "RemoveContainer" containerID="18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.235932 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": container with ID starting with 18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3 not found: ID does not exist" containerID="18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.235964 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} err="failed to get container status \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": rpc error: code = NotFound desc = could not find container \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": container with ID starting with 18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.235983 4935 scope.go:117] "RemoveContainer" containerID="a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.236224 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": container with ID starting with a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94 not found: ID does not exist" containerID="a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.236253 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} err="failed to get container status \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": rpc error: code = NotFound desc = could not find container \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": container with ID 
starting with a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.236295 4935 scope.go:117] "RemoveContainer" containerID="fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.236621 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": container with ID starting with fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666 not found: ID does not exist" containerID="fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.236727 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} err="failed to get container status \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": rpc error: code = NotFound desc = could not find container \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": container with ID starting with fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.236822 4935 scope.go:117] "RemoveContainer" containerID="7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.237134 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": container with ID starting with 7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302 not found: ID does not exist" containerID="7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" Dec 17 
09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.237214 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} err="failed to get container status \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": rpc error: code = NotFound desc = could not find container \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": container with ID starting with 7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.237339 4935 scope.go:117] "RemoveContainer" containerID="38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.237652 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": container with ID starting with 38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1 not found: ID does not exist" containerID="38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.237679 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} err="failed to get container status \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": rpc error: code = NotFound desc = could not find container \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": container with ID starting with 38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.237696 4935 scope.go:117] "RemoveContainer" 
containerID="2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.237951 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": container with ID starting with 2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3 not found: ID does not exist" containerID="2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.237995 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} err="failed to get container status \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": rpc error: code = NotFound desc = could not find container \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": container with ID starting with 2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.238014 4935 scope.go:117] "RemoveContainer" containerID="ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.238326 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": container with ID starting with ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779 not found: ID does not exist" containerID="ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.238358 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} err="failed to get container status \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": rpc error: code = NotFound desc = could not find container \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": container with ID starting with ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.238432 4935 scope.go:117] "RemoveContainer" containerID="c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f" Dec 17 09:16:53 crc kubenswrapper[4935]: E1217 09:16:53.238712 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": container with ID starting with c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f not found: ID does not exist" containerID="c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.238745 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} err="failed to get container status \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": rpc error: code = NotFound desc = could not find container \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": container with ID starting with c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.238791 4935 scope.go:117] "RemoveContainer" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239110 4935 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} err="failed to get container status \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": rpc error: code = NotFound desc = could not find container \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": container with ID starting with fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239135 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239375 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} err="failed to get container status \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": rpc error: code = NotFound desc = could not find container \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": container with ID starting with b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239399 4935 scope.go:117] "RemoveContainer" containerID="18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239601 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} err="failed to get container status \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": rpc error: code = NotFound desc = could not find container \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": container with ID starting with 18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3 not 
found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239626 4935 scope.go:117] "RemoveContainer" containerID="a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239941 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} err="failed to get container status \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": rpc error: code = NotFound desc = could not find container \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": container with ID starting with a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.239966 4935 scope.go:117] "RemoveContainer" containerID="fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.240198 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} err="failed to get container status \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": rpc error: code = NotFound desc = could not find container \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": container with ID starting with fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.240242 4935 scope.go:117] "RemoveContainer" containerID="7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.240501 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} err="failed to get 
container status \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": rpc error: code = NotFound desc = could not find container \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": container with ID starting with 7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.240522 4935 scope.go:117] "RemoveContainer" containerID="38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.240776 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} err="failed to get container status \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": rpc error: code = NotFound desc = could not find container \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": container with ID starting with 38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.240798 4935 scope.go:117] "RemoveContainer" containerID="2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.241070 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} err="failed to get container status \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": rpc error: code = NotFound desc = could not find container \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": container with ID starting with 2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.241191 4935 scope.go:117] "RemoveContainer" 
containerID="ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.241968 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} err="failed to get container status \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": rpc error: code = NotFound desc = could not find container \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": container with ID starting with ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.241997 4935 scope.go:117] "RemoveContainer" containerID="c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.242588 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} err="failed to get container status \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": rpc error: code = NotFound desc = could not find container \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": container with ID starting with c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.242640 4935 scope.go:117] "RemoveContainer" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.242992 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} err="failed to get container status \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": rpc error: code = NotFound desc = could 
not find container \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": container with ID starting with fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.243015 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.243762 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} err="failed to get container status \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": rpc error: code = NotFound desc = could not find container \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": container with ID starting with b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.243788 4935 scope.go:117] "RemoveContainer" containerID="18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.244536 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} err="failed to get container status \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": rpc error: code = NotFound desc = could not find container \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": container with ID starting with 18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.244585 4935 scope.go:117] "RemoveContainer" containerID="a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 
09:16:53.245185 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} err="failed to get container status \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": rpc error: code = NotFound desc = could not find container \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": container with ID starting with a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.245306 4935 scope.go:117] "RemoveContainer" containerID="fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.245568 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} err="failed to get container status \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": rpc error: code = NotFound desc = could not find container \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": container with ID starting with fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.245613 4935 scope.go:117] "RemoveContainer" containerID="7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.246942 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} err="failed to get container status \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": rpc error: code = NotFound desc = could not find container \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": container with ID starting with 
7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.246979 4935 scope.go:117] "RemoveContainer" containerID="38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.247402 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} err="failed to get container status \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": rpc error: code = NotFound desc = could not find container \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": container with ID starting with 38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.247429 4935 scope.go:117] "RemoveContainer" containerID="2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.247834 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} err="failed to get container status \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": rpc error: code = NotFound desc = could not find container \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": container with ID starting with 2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.247884 4935 scope.go:117] "RemoveContainer" containerID="ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.248476 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} err="failed to get container status \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": rpc error: code = NotFound desc = could not find container \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": container with ID starting with ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.248534 4935 scope.go:117] "RemoveContainer" containerID="c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.248950 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} err="failed to get container status \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": rpc error: code = NotFound desc = could not find container \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": container with ID starting with c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.248978 4935 scope.go:117] "RemoveContainer" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.249341 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} err="failed to get container status \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": rpc error: code = NotFound desc = could not find container \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": container with ID starting with fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330 not found: ID does not 
exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.249363 4935 scope.go:117] "RemoveContainer" containerID="b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.249664 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319"} err="failed to get container status \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": rpc error: code = NotFound desc = could not find container \"b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319\": container with ID starting with b3ac77145a8c72d804697a355791bf57fb66454a368d02cb3fa8c89268d7c319 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.249692 4935 scope.go:117] "RemoveContainer" containerID="18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.250011 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3"} err="failed to get container status \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": rpc error: code = NotFound desc = could not find container \"18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3\": container with ID starting with 18d10547f51f03817bdbdcabc67242ef5c5999069e4462808758d0fc18ca3ac3 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.250031 4935 scope.go:117] "RemoveContainer" containerID="a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.250351 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94"} err="failed to get container status 
\"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": rpc error: code = NotFound desc = could not find container \"a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94\": container with ID starting with a007841ff89335c18734ce2c1b0c2a93c89ab83bfaf9bdfa0249d59e90acff94 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.250381 4935 scope.go:117] "RemoveContainer" containerID="fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.250671 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666"} err="failed to get container status \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": rpc error: code = NotFound desc = could not find container \"fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666\": container with ID starting with fd999ede3b9bf9a6cce21755e944bf770fedcdfc109ba37a7b2b8243500dc666 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.250701 4935 scope.go:117] "RemoveContainer" containerID="7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.251048 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302"} err="failed to get container status \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": rpc error: code = NotFound desc = could not find container \"7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302\": container with ID starting with 7cff5d43daec7608b5baf60e82ed72e3ce7eeb634a3c8e72dc18b43778173302 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.251070 4935 scope.go:117] "RemoveContainer" 
containerID="38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.251378 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1"} err="failed to get container status \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": rpc error: code = NotFound desc = could not find container \"38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1\": container with ID starting with 38477db448880a405b137be0acf771ecf81f66852463e6ca01770c63bb0628b1 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.251407 4935 scope.go:117] "RemoveContainer" containerID="2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.251689 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3"} err="failed to get container status \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": rpc error: code = NotFound desc = could not find container \"2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3\": container with ID starting with 2d803aba016e02da4b1c1e9a32945505c4c77231f2f13682d2d673c3460fb1c3 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.251714 4935 scope.go:117] "RemoveContainer" containerID="ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.252012 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779"} err="failed to get container status \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": rpc error: code = NotFound desc = could 
not find container \"ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779\": container with ID starting with ec639d2b7c5bfc960ce41e893100507ae3b2c595dc183d8d3563964d958ca779 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.252035 4935 scope.go:117] "RemoveContainer" containerID="c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.252296 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f"} err="failed to get container status \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": rpc error: code = NotFound desc = could not find container \"c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f\": container with ID starting with c411c2faa3a6516b94a6baf6eaee1f65bfe83f9f7976d7b43ea6ee6fdf79014f not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.252323 4935 scope.go:117] "RemoveContainer" containerID="fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.252617 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330"} err="failed to get container status \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": rpc error: code = NotFound desc = could not find container \"fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330\": container with ID starting with fbe28a0555b406dc63023f7da10481726083f2f55d456290c8c06d572317b330 not found: ID does not exist" Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.256888 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:16:53 crc kubenswrapper[4935]: W1217 09:16:53.273670 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0001f83_34e8_4b76_bd88_76712eacf85f.slice/crio-19978c84bd1e9fd283e39aad20b64dbe0e07ff345a80470313506cbb6cece2f2 WatchSource:0}: Error finding container 19978c84bd1e9fd283e39aad20b64dbe0e07ff345a80470313506cbb6cece2f2: Status 404 returned error can't find the container with id 19978c84bd1e9fd283e39aad20b64dbe0e07ff345a80470313506cbb6cece2f2 Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.383372 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rwwd4"] Dec 17 09:16:53 crc kubenswrapper[4935]: I1217 09:16:53.387340 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rwwd4"] Dec 17 09:16:54 crc kubenswrapper[4935]: I1217 09:16:54.082221 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/2.log" Dec 17 09:16:54 crc kubenswrapper[4935]: I1217 09:16:54.083739 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/1.log" Dec 17 09:16:54 crc kubenswrapper[4935]: I1217 09:16:54.083910 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jrmtf" event={"ID":"8b52811a-aff2-43c1-9074-f0654f991d9c","Type":"ContainerStarted","Data":"df59d20a431f72bb6271d9d642b498a6dfc3c9c5b6aeb830249a7a24a2f96209"} Dec 17 09:16:54 crc kubenswrapper[4935]: I1217 09:16:54.088934 4935 generic.go:334] "Generic (PLEG): container finished" podID="f0001f83-34e8-4b76-bd88-76712eacf85f" containerID="acd8cd3602c84b604301a0a0c8473fcf3fc00041882e6d09bfd17220c980676a" exitCode=0 Dec 17 09:16:54 crc kubenswrapper[4935]: I1217 
09:16:54.089001 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerDied","Data":"acd8cd3602c84b604301a0a0c8473fcf3fc00041882e6d09bfd17220c980676a"} Dec 17 09:16:54 crc kubenswrapper[4935]: I1217 09:16:54.089051 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"19978c84bd1e9fd283e39aad20b64dbe0e07ff345a80470313506cbb6cece2f2"} Dec 17 09:16:55 crc kubenswrapper[4935]: I1217 09:16:55.101865 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"6fe0d4bda508461ce400a1b32c2b97d672500922169c1b2e7cc66e6bb5285faf"} Dec 17 09:16:55 crc kubenswrapper[4935]: I1217 09:16:55.102382 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"fa6260e8aff1e966351de5a40eaa2f5d4c31c94e9d2f566a0ac2fa07f7af61c0"} Dec 17 09:16:55 crc kubenswrapper[4935]: I1217 09:16:55.102397 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"e899fd4bc4ed191f32ffeebe7329cf029c24c5d298895bad607b01871d10f6fd"} Dec 17 09:16:55 crc kubenswrapper[4935]: I1217 09:16:55.102410 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"b2cdb66235f1fcbe3ce27933344e0ac93c0ca277d44805f267e941820c94454c"} Dec 17 09:16:55 crc kubenswrapper[4935]: I1217 09:16:55.102422 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"cad708f59d3b22f46dfdf1521df93a85232bbc496e1aea04d17c0cc27cbb491f"} Dec 17 09:16:55 crc kubenswrapper[4935]: I1217 09:16:55.102432 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"1995da4b67c199ef730ad88c087ffb595697c6e08cc1cbe0c255ac23db52ac45"} Dec 17 09:16:55 crc kubenswrapper[4935]: I1217 09:16:55.131946 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969f53bb-09fc-4577-8f7c-dc6ca1679add" path="/var/lib/kubelet/pods/969f53bb-09fc-4577-8f7c-dc6ca1679add/volumes" Dec 17 09:16:57 crc kubenswrapper[4935]: I1217 09:16:57.136778 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"3bfe23c7bb8b8d6e1bd07f1800a0e3ae0598bdccbac90aef55ce42408ee754d9"} Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.130769 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.131144 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.159601 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" event={"ID":"f0001f83-34e8-4b76-bd88-76712eacf85f","Type":"ContainerStarted","Data":"11df8d635bfd21e7935755885bbf7abacf9ea55db26c4a2dead0f75e65826a63"} Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.160031 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.160222 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.160263 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.196038 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" podStartSLOduration=8.19601112 podStartE2EDuration="8.19601112s" podCreationTimestamp="2025-12-17 09:16:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:17:00.190747682 +0000 UTC m=+739.850588465" watchObservedRunningTime="2025-12-17 09:17:00.19601112 +0000 UTC m=+739.855851893" Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.198550 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:17:00 crc kubenswrapper[4935]: I1217 09:17:00.198630 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:17:16 crc kubenswrapper[4935]: I1217 09:17:16.299641 4935 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 17 09:17:23 crc kubenswrapper[4935]: I1217 09:17:23.277581 4935 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f28dc" Dec 17 09:17:30 crc kubenswrapper[4935]: I1217 09:17:30.130455 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:17:30 crc kubenswrapper[4935]: I1217 09:17:30.130972 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.083284 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df"] Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.084785 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.087316 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.099917 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df"] Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.182477 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.182566 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.182630 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf6fh\" (UniqueName: \"kubernetes.io/projected/d50ec877-316c-4993-9906-7830748759d7-kube-api-access-cf6fh\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: 
I1217 09:17:33.284222 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.284353 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.284423 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf6fh\" (UniqueName: \"kubernetes.io/projected/d50ec877-316c-4993-9906-7830748759d7-kube-api-access-cf6fh\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.285680 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-bundle\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.285713 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-util\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.315020 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf6fh\" (UniqueName: \"kubernetes.io/projected/d50ec877-316c-4993-9906-7830748759d7-kube-api-access-cf6fh\") pod \"98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.407909 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:33 crc kubenswrapper[4935]: I1217 09:17:33.606942 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df"] Dec 17 09:17:34 crc kubenswrapper[4935]: I1217 09:17:34.362732 4935 generic.go:334] "Generic (PLEG): container finished" podID="d50ec877-316c-4993-9906-7830748759d7" containerID="7b5e46ee74f9576023ebe59e0f5cab042a77c7b50fde48885cde38113a048d2b" exitCode=0 Dec 17 09:17:34 crc kubenswrapper[4935]: I1217 09:17:34.362849 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" event={"ID":"d50ec877-316c-4993-9906-7830748759d7","Type":"ContainerDied","Data":"7b5e46ee74f9576023ebe59e0f5cab042a77c7b50fde48885cde38113a048d2b"} Dec 17 09:17:34 crc kubenswrapper[4935]: I1217 09:17:34.363429 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" event={"ID":"d50ec877-316c-4993-9906-7830748759d7","Type":"ContainerStarted","Data":"fb82b178ed981f6bfef4af63590f10861b572a3dc83fcc6e51750a3503868616"} Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.413801 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-95pv6"] Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.415181 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.439067 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95pv6"] Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.517221 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-utilities\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.517312 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtm5w\" (UniqueName: \"kubernetes.io/projected/116128fe-0957-43c8-ad33-060397b889b5-kube-api-access-vtm5w\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.517351 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-catalog-content\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " 
pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.618431 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-utilities\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.618491 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtm5w\" (UniqueName: \"kubernetes.io/projected/116128fe-0957-43c8-ad33-060397b889b5-kube-api-access-vtm5w\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.618512 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-catalog-content\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.619047 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-catalog-content\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.619159 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-utilities\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc 
kubenswrapper[4935]: I1217 09:17:35.643653 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtm5w\" (UniqueName: \"kubernetes.io/projected/116128fe-0957-43c8-ad33-060397b889b5-kube-api-access-vtm5w\") pod \"redhat-operators-95pv6\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.742585 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:35 crc kubenswrapper[4935]: I1217 09:17:35.974518 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-95pv6"] Dec 17 09:17:36 crc kubenswrapper[4935]: I1217 09:17:36.379377 4935 generic.go:334] "Generic (PLEG): container finished" podID="d50ec877-316c-4993-9906-7830748759d7" containerID="569f21df8f9efe808c3bdff4b6e80e986bf29e4ce17a1c0aeab27911db97fea4" exitCode=0 Dec 17 09:17:36 crc kubenswrapper[4935]: I1217 09:17:36.379477 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" event={"ID":"d50ec877-316c-4993-9906-7830748759d7","Type":"ContainerDied","Data":"569f21df8f9efe808c3bdff4b6e80e986bf29e4ce17a1c0aeab27911db97fea4"} Dec 17 09:17:36 crc kubenswrapper[4935]: I1217 09:17:36.381885 4935 generic.go:334] "Generic (PLEG): container finished" podID="116128fe-0957-43c8-ad33-060397b889b5" containerID="1d60009ee24981e33c50064404686c5f30449f8b56517d823f9e1ef6548979f8" exitCode=0 Dec 17 09:17:36 crc kubenswrapper[4935]: I1217 09:17:36.381966 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95pv6" event={"ID":"116128fe-0957-43c8-ad33-060397b889b5","Type":"ContainerDied","Data":"1d60009ee24981e33c50064404686c5f30449f8b56517d823f9e1ef6548979f8"} Dec 17 09:17:36 crc kubenswrapper[4935]: I1217 09:17:36.381984 4935 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95pv6" event={"ID":"116128fe-0957-43c8-ad33-060397b889b5","Type":"ContainerStarted","Data":"083d4dfbfaccfc672def4232247832e7c5ce2118fde1cfff635eb616fff41b81"} Dec 17 09:17:37 crc kubenswrapper[4935]: I1217 09:17:37.389724 4935 generic.go:334] "Generic (PLEG): container finished" podID="d50ec877-316c-4993-9906-7830748759d7" containerID="e961da39cc1650340075e5fbcb736c22562e5e00bec084c717fc46cca8a5201d" exitCode=0 Dec 17 09:17:37 crc kubenswrapper[4935]: I1217 09:17:37.389875 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" event={"ID":"d50ec877-316c-4993-9906-7830748759d7","Type":"ContainerDied","Data":"e961da39cc1650340075e5fbcb736c22562e5e00bec084c717fc46cca8a5201d"} Dec 17 09:17:37 crc kubenswrapper[4935]: I1217 09:17:37.392911 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95pv6" event={"ID":"116128fe-0957-43c8-ad33-060397b889b5","Type":"ContainerStarted","Data":"35109039b48a399df9b6c1083b198f7d1ad68aa0b960be46bf10a29637b13d66"} Dec 17 09:17:38 crc kubenswrapper[4935]: I1217 09:17:38.818979 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:38 crc kubenswrapper[4935]: I1217 09:17:38.961410 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-bundle\") pod \"d50ec877-316c-4993-9906-7830748759d7\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " Dec 17 09:17:38 crc kubenswrapper[4935]: I1217 09:17:38.961467 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-util\") pod \"d50ec877-316c-4993-9906-7830748759d7\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " Dec 17 09:17:38 crc kubenswrapper[4935]: I1217 09:17:38.961603 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf6fh\" (UniqueName: \"kubernetes.io/projected/d50ec877-316c-4993-9906-7830748759d7-kube-api-access-cf6fh\") pod \"d50ec877-316c-4993-9906-7830748759d7\" (UID: \"d50ec877-316c-4993-9906-7830748759d7\") " Dec 17 09:17:38 crc kubenswrapper[4935]: I1217 09:17:38.962082 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-bundle" (OuterVolumeSpecName: "bundle") pod "d50ec877-316c-4993-9906-7830748759d7" (UID: "d50ec877-316c-4993-9906-7830748759d7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:17:38 crc kubenswrapper[4935]: I1217 09:17:38.966739 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d50ec877-316c-4993-9906-7830748759d7-kube-api-access-cf6fh" (OuterVolumeSpecName: "kube-api-access-cf6fh") pod "d50ec877-316c-4993-9906-7830748759d7" (UID: "d50ec877-316c-4993-9906-7830748759d7"). InnerVolumeSpecName "kube-api-access-cf6fh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:17:38 crc kubenswrapper[4935]: I1217 09:17:38.977509 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-util" (OuterVolumeSpecName: "util") pod "d50ec877-316c-4993-9906-7830748759d7" (UID: "d50ec877-316c-4993-9906-7830748759d7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.063372 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf6fh\" (UniqueName: \"kubernetes.io/projected/d50ec877-316c-4993-9906-7830748759d7-kube-api-access-cf6fh\") on node \"crc\" DevicePath \"\"" Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.063602 4935 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.063657 4935 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d50ec877-316c-4993-9906-7830748759d7-util\") on node \"crc\" DevicePath \"\"" Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.409130 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.409126 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df" event={"ID":"d50ec877-316c-4993-9906-7830748759d7","Type":"ContainerDied","Data":"fb82b178ed981f6bfef4af63590f10861b572a3dc83fcc6e51750a3503868616"} Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.409592 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb82b178ed981f6bfef4af63590f10861b572a3dc83fcc6e51750a3503868616" Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.411321 4935 generic.go:334] "Generic (PLEG): container finished" podID="116128fe-0957-43c8-ad33-060397b889b5" containerID="35109039b48a399df9b6c1083b198f7d1ad68aa0b960be46bf10a29637b13d66" exitCode=0 Dec 17 09:17:39 crc kubenswrapper[4935]: I1217 09:17:39.411396 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95pv6" event={"ID":"116128fe-0957-43c8-ad33-060397b889b5","Type":"ContainerDied","Data":"35109039b48a399df9b6c1083b198f7d1ad68aa0b960be46bf10a29637b13d66"} Dec 17 09:17:40 crc kubenswrapper[4935]: I1217 09:17:40.419129 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95pv6" event={"ID":"116128fe-0957-43c8-ad33-060397b889b5","Type":"ContainerStarted","Data":"fefbbe211f2be329bc3c5af0bfa260b56af48ff9ea4707fd43bd18d20ccd9ef2"} Dec 17 09:17:40 crc kubenswrapper[4935]: I1217 09:17:40.441741 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-95pv6" podStartSLOduration=1.926228975 podStartE2EDuration="5.441710953s" podCreationTimestamp="2025-12-17 09:17:35 +0000 UTC" firstStartedPulling="2025-12-17 09:17:36.384257758 +0000 UTC m=+776.044098521" 
lastFinishedPulling="2025-12-17 09:17:39.899739736 +0000 UTC m=+779.559580499" observedRunningTime="2025-12-17 09:17:40.437344806 +0000 UTC m=+780.097185569" watchObservedRunningTime="2025-12-17 09:17:40.441710953 +0000 UTC m=+780.101551716" Dec 17 09:17:41 crc kubenswrapper[4935]: I1217 09:17:41.361406 4935 scope.go:117] "RemoveContainer" containerID="f4ecb29aa69ed7a4c7546208086eebf593112c86018a8769f01d335effc55a0a" Dec 17 09:17:41 crc kubenswrapper[4935]: I1217 09:17:41.428417 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jrmtf_8b52811a-aff2-43c1-9074-f0654f991d9c/kube-multus/2.log" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.475432 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-7z6d4"] Dec 17 09:17:43 crc kubenswrapper[4935]: E1217 09:17:43.475686 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50ec877-316c-4993-9906-7830748759d7" containerName="util" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.475699 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50ec877-316c-4993-9906-7830748759d7" containerName="util" Dec 17 09:17:43 crc kubenswrapper[4935]: E1217 09:17:43.475707 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50ec877-316c-4993-9906-7830748759d7" containerName="pull" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.475713 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50ec877-316c-4993-9906-7830748759d7" containerName="pull" Dec 17 09:17:43 crc kubenswrapper[4935]: E1217 09:17:43.475724 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d50ec877-316c-4993-9906-7830748759d7" containerName="extract" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.475730 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d50ec877-316c-4993-9906-7830748759d7" containerName="extract" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 
09:17:43.475823 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="d50ec877-316c-4993-9906-7830748759d7" containerName="extract" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.476238 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.478003 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.478066 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.479049 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wx8wg" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.490499 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-7z6d4"] Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.621121 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzbvg\" (UniqueName: \"kubernetes.io/projected/4fe03c14-1b10-4f5d-8107-3037bf3fd42e-kube-api-access-kzbvg\") pod \"nmstate-operator-6769fb99d-7z6d4\" (UID: \"4fe03c14-1b10-4f5d-8107-3037bf3fd42e\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.722454 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzbvg\" (UniqueName: \"kubernetes.io/projected/4fe03c14-1b10-4f5d-8107-3037bf3fd42e-kube-api-access-kzbvg\") pod \"nmstate-operator-6769fb99d-7z6d4\" (UID: \"4fe03c14-1b10-4f5d-8107-3037bf3fd42e\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.741855 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzbvg\" (UniqueName: \"kubernetes.io/projected/4fe03c14-1b10-4f5d-8107-3037bf3fd42e-kube-api-access-kzbvg\") pod \"nmstate-operator-6769fb99d-7z6d4\" (UID: \"4fe03c14-1b10-4f5d-8107-3037bf3fd42e\") " pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" Dec 17 09:17:43 crc kubenswrapper[4935]: I1217 09:17:43.792332 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" Dec 17 09:17:44 crc kubenswrapper[4935]: I1217 09:17:44.006219 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-6769fb99d-7z6d4"] Dec 17 09:17:44 crc kubenswrapper[4935]: I1217 09:17:44.448854 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" event={"ID":"4fe03c14-1b10-4f5d-8107-3037bf3fd42e","Type":"ContainerStarted","Data":"fa64726371f6ef02aad79be7622ee886e4261430e3ced401bbcea840b969fea6"} Dec 17 09:17:45 crc kubenswrapper[4935]: I1217 09:17:45.743403 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:45 crc kubenswrapper[4935]: I1217 09:17:45.743727 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:45 crc kubenswrapper[4935]: I1217 09:17:45.810794 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:46 crc kubenswrapper[4935]: I1217 09:17:46.525478 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:47 crc kubenswrapper[4935]: I1217 09:17:47.467904 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" 
event={"ID":"4fe03c14-1b10-4f5d-8107-3037bf3fd42e","Type":"ContainerStarted","Data":"46e896980898f1fbf4bb6cc4e4b6483a572a600ef1e25595ced11e70f251f571"} Dec 17 09:17:47 crc kubenswrapper[4935]: I1217 09:17:47.491802 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-6769fb99d-7z6d4" podStartSLOduration=1.6561899759999998 podStartE2EDuration="4.491772478s" podCreationTimestamp="2025-12-17 09:17:43 +0000 UTC" firstStartedPulling="2025-12-17 09:17:44.016170422 +0000 UTC m=+783.676011175" lastFinishedPulling="2025-12-17 09:17:46.851752914 +0000 UTC m=+786.511593677" observedRunningTime="2025-12-17 09:17:47.48732958 +0000 UTC m=+787.147170343" watchObservedRunningTime="2025-12-17 09:17:47.491772478 +0000 UTC m=+787.151613241" Dec 17 09:17:48 crc kubenswrapper[4935]: I1217 09:17:48.404228 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95pv6"] Dec 17 09:17:49 crc kubenswrapper[4935]: I1217 09:17:49.480631 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-95pv6" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="registry-server" containerID="cri-o://fefbbe211f2be329bc3c5af0bfa260b56af48ff9ea4707fd43bd18d20ccd9ef2" gracePeriod=2 Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.494040 4935 generic.go:334] "Generic (PLEG): container finished" podID="116128fe-0957-43c8-ad33-060397b889b5" containerID="fefbbe211f2be329bc3c5af0bfa260b56af48ff9ea4707fd43bd18d20ccd9ef2" exitCode=0 Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.494163 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95pv6" event={"ID":"116128fe-0957-43c8-ad33-060397b889b5","Type":"ContainerDied","Data":"fefbbe211f2be329bc3c5af0bfa260b56af48ff9ea4707fd43bd18d20ccd9ef2"} Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.700608 4935 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.848484 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-catalog-content\") pod \"116128fe-0957-43c8-ad33-060397b889b5\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.848552 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtm5w\" (UniqueName: \"kubernetes.io/projected/116128fe-0957-43c8-ad33-060397b889b5-kube-api-access-vtm5w\") pod \"116128fe-0957-43c8-ad33-060397b889b5\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.848683 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-utilities\") pod \"116128fe-0957-43c8-ad33-060397b889b5\" (UID: \"116128fe-0957-43c8-ad33-060397b889b5\") " Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.850006 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-utilities" (OuterVolumeSpecName: "utilities") pod "116128fe-0957-43c8-ad33-060397b889b5" (UID: "116128fe-0957-43c8-ad33-060397b889b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.857884 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/116128fe-0957-43c8-ad33-060397b889b5-kube-api-access-vtm5w" (OuterVolumeSpecName: "kube-api-access-vtm5w") pod "116128fe-0957-43c8-ad33-060397b889b5" (UID: "116128fe-0957-43c8-ad33-060397b889b5"). 
InnerVolumeSpecName "kube-api-access-vtm5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.950911 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.950983 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtm5w\" (UniqueName: \"kubernetes.io/projected/116128fe-0957-43c8-ad33-060397b889b5-kube-api-access-vtm5w\") on node \"crc\" DevicePath \"\"" Dec 17 09:17:51 crc kubenswrapper[4935]: I1217 09:17:51.993717 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "116128fe-0957-43c8-ad33-060397b889b5" (UID: "116128fe-0957-43c8-ad33-060397b889b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.052998 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/116128fe-0957-43c8-ad33-060397b889b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.503486 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-95pv6" event={"ID":"116128fe-0957-43c8-ad33-060397b889b5","Type":"ContainerDied","Data":"083d4dfbfaccfc672def4232247832e7c5ce2118fde1cfff635eb616fff41b81"} Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.503570 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-95pv6" Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.503623 4935 scope.go:117] "RemoveContainer" containerID="fefbbe211f2be329bc3c5af0bfa260b56af48ff9ea4707fd43bd18d20ccd9ef2" Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.523096 4935 scope.go:117] "RemoveContainer" containerID="35109039b48a399df9b6c1083b198f7d1ad68aa0b960be46bf10a29637b13d66" Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.544473 4935 scope.go:117] "RemoveContainer" containerID="1d60009ee24981e33c50064404686c5f30449f8b56517d823f9e1ef6548979f8" Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.547397 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-95pv6"] Dec 17 09:17:52 crc kubenswrapper[4935]: I1217 09:17:52.555213 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-95pv6"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.131561 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="116128fe-0957-43c8-ad33-060397b889b5" path="/var/lib/kubelet/pods/116128fe-0957-43c8-ad33-060397b889b5/volumes" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.482995 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf"] Dec 17 09:17:53 crc kubenswrapper[4935]: E1217 09:17:53.483304 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="extract-content" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.483319 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="extract-content" Dec 17 09:17:53 crc kubenswrapper[4935]: E1217 09:17:53.483333 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="extract-utilities" Dec 17 09:17:53 crc 
kubenswrapper[4935]: I1217 09:17:53.483339 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="extract-utilities" Dec 17 09:17:53 crc kubenswrapper[4935]: E1217 09:17:53.483350 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="registry-server" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.483356 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="registry-server" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.483463 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="116128fe-0957-43c8-ad33-060397b889b5" containerName="registry-server" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.484080 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.490620 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-b78q8" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.496414 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-whnkc"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.499909 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.502736 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.510668 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.525239 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-whnkc"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.539456 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-29rb4"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.547036 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.574100 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2mlj\" (UniqueName: \"kubernetes.io/projected/c0fd208a-4408-45b2-88f7-979bf751ada6-kube-api-access-r2mlj\") pod \"nmstate-metrics-7f7f7578db-n6kbf\" (UID: \"c0fd208a-4408-45b2-88f7-979bf751ada6\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.636841 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.637877 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.644683 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-c4mjc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.644870 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.644925 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.662547 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.675964 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqzdc\" (UniqueName: \"kubernetes.io/projected/ffdacd08-7751-465d-a6f2-9037a7307280-kube-api-access-tqzdc\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.676155 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e4582acb-2858-4fab-8bc9-e8e6ee6589dd-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-whnkc\" (UID: \"e4582acb-2858-4fab-8bc9-e8e6ee6589dd\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.676240 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bpkr\" (UniqueName: \"kubernetes.io/projected/e4582acb-2858-4fab-8bc9-e8e6ee6589dd-kube-api-access-5bpkr\") pod \"nmstate-webhook-f8fb84555-whnkc\" (UID: \"e4582acb-2858-4fab-8bc9-e8e6ee6589dd\") " 
pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.676355 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-dbus-socket\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.676455 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-ovs-socket\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.676554 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2mlj\" (UniqueName: \"kubernetes.io/projected/c0fd208a-4408-45b2-88f7-979bf751ada6-kube-api-access-r2mlj\") pod \"nmstate-metrics-7f7f7578db-n6kbf\" (UID: \"c0fd208a-4408-45b2-88f7-979bf751ada6\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.676630 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-nmstate-lock\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.697729 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2mlj\" (UniqueName: \"kubernetes.io/projected/c0fd208a-4408-45b2-88f7-979bf751ada6-kube-api-access-r2mlj\") pod \"nmstate-metrics-7f7f7578db-n6kbf\" (UID: 
\"c0fd208a-4408-45b2-88f7-979bf751ada6\") " pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778042 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-ovs-socket\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778117 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-nmstate-lock\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778207 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqzdc\" (UniqueName: \"kubernetes.io/projected/ffdacd08-7751-465d-a6f2-9037a7307280-kube-api-access-tqzdc\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778352 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e4582acb-2858-4fab-8bc9-e8e6ee6589dd-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-whnkc\" (UID: \"e4582acb-2858-4fab-8bc9-e8e6ee6589dd\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778238 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-ovs-socket\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " 
pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778441 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c10c7c6c-f801-40c4-bff9-0f7b740e662b-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778301 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-nmstate-lock\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778531 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bpkr\" (UniqueName: \"kubernetes.io/projected/e4582acb-2858-4fab-8bc9-e8e6ee6589dd-kube-api-access-5bpkr\") pod \"nmstate-webhook-f8fb84555-whnkc\" (UID: \"e4582acb-2858-4fab-8bc9-e8e6ee6589dd\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.778969 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-dbus-socket\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.779388 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ffdacd08-7751-465d-a6f2-9037a7307280-dbus-socket\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " 
pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.780127 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzbjz\" (UniqueName: \"kubernetes.io/projected/c10c7c6c-f801-40c4-bff9-0f7b740e662b-kube-api-access-qzbjz\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.780173 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c7c6c-f801-40c4-bff9-0f7b740e662b-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.782459 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e4582acb-2858-4fab-8bc9-e8e6ee6589dd-tls-key-pair\") pod \"nmstate-webhook-f8fb84555-whnkc\" (UID: \"e4582acb-2858-4fab-8bc9-e8e6ee6589dd\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.798861 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqzdc\" (UniqueName: \"kubernetes.io/projected/ffdacd08-7751-465d-a6f2-9037a7307280-kube-api-access-tqzdc\") pod \"nmstate-handler-29rb4\" (UID: \"ffdacd08-7751-465d-a6f2-9037a7307280\") " pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.801590 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bpkr\" (UniqueName: \"kubernetes.io/projected/e4582acb-2858-4fab-8bc9-e8e6ee6589dd-kube-api-access-5bpkr\") pod 
\"nmstate-webhook-f8fb84555-whnkc\" (UID: \"e4582acb-2858-4fab-8bc9-e8e6ee6589dd\") " pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.820816 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.820920 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-858544c664-76x7v"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.821650 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.829358 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.842842 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858544c664-76x7v"] Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.863788 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.881781 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c10c7c6c-f801-40c4-bff9-0f7b740e662b-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.882025 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzbjz\" (UniqueName: \"kubernetes.io/projected/c10c7c6c-f801-40c4-bff9-0f7b740e662b-kube-api-access-qzbjz\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.882811 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c7c6c-f801-40c4-bff9-0f7b740e662b-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: E1217 09:17:53.882972 4935 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.883013 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c10c7c6c-f801-40c4-bff9-0f7b740e662b-nginx-conf\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: E1217 09:17:53.883051 4935 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c10c7c6c-f801-40c4-bff9-0f7b740e662b-plugin-serving-cert podName:c10c7c6c-f801-40c4-bff9-0f7b740e662b nodeName:}" failed. No retries permitted until 2025-12-17 09:17:54.383031221 +0000 UTC m=+794.042871984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/c10c7c6c-f801-40c4-bff9-0f7b740e662b-plugin-serving-cert") pod "nmstate-console-plugin-6ff7998486-wd26t" (UID: "c10c7c6c-f801-40c4-bff9-0f7b740e662b") : secret "plugin-serving-cert" not found Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.903944 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzbjz\" (UniqueName: \"kubernetes.io/projected/c10c7c6c-f801-40c4-bff9-0f7b740e662b-kube-api-access-qzbjz\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.984087 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31d61f85-98e7-44c2-8bdf-276559c01389-console-serving-cert\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.984153 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89lm\" (UniqueName: \"kubernetes.io/projected/31d61f85-98e7-44c2-8bdf-276559c01389-kube-api-access-v89lm\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.984403 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-oauth-serving-cert\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.984465 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-trusted-ca-bundle\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.984596 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-service-ca\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.984646 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-console-config\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:53 crc kubenswrapper[4935]: I1217 09:17:53.984705 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31d61f85-98e7-44c2-8bdf-276559c01389-console-oauth-config\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc 
kubenswrapper[4935]: I1217 09:17:54.086522 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-console-config\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.087323 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31d61f85-98e7-44c2-8bdf-276559c01389-console-oauth-config\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.087394 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31d61f85-98e7-44c2-8bdf-276559c01389-console-serving-cert\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.087449 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v89lm\" (UniqueName: \"kubernetes.io/projected/31d61f85-98e7-44c2-8bdf-276559c01389-kube-api-access-v89lm\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.087602 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-oauth-serving-cert\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc 
kubenswrapper[4935]: I1217 09:17:54.087641 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-trusted-ca-bundle\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.087745 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-console-config\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.087756 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-service-ca\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.088577 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-service-ca\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.088663 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-oauth-serving-cert\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.089149 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/31d61f85-98e7-44c2-8bdf-276559c01389-trusted-ca-bundle\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.094844 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/31d61f85-98e7-44c2-8bdf-276559c01389-console-oauth-config\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.117351 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-f8fb84555-whnkc"] Dec 17 09:17:54 crc kubenswrapper[4935]: W1217 09:17:54.123728 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4582acb_2858_4fab_8bc9_e8e6ee6589dd.slice/crio-2d7b889da927df1d2c87e20fdf363a1dd4ea0726303bb3b11f4aa4970f3fc0f8 WatchSource:0}: Error finding container 2d7b889da927df1d2c87e20fdf363a1dd4ea0726303bb3b11f4aa4970f3fc0f8: Status 404 returned error can't find the container with id 2d7b889da927df1d2c87e20fdf363a1dd4ea0726303bb3b11f4aa4970f3fc0f8 Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.157956 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf"] Dec 17 09:17:54 crc kubenswrapper[4935]: W1217 09:17:54.160464 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0fd208a_4408_45b2_88f7_979bf751ada6.slice/crio-de8baf0932a292cf4a70edb637119f1835c6e674ccdefb80bc3c6b2e76607c35 WatchSource:0}: Error finding container de8baf0932a292cf4a70edb637119f1835c6e674ccdefb80bc3c6b2e76607c35: Status 404 
returned error can't find the container with id de8baf0932a292cf4a70edb637119f1835c6e674ccdefb80bc3c6b2e76607c35 Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.176034 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/31d61f85-98e7-44c2-8bdf-276559c01389-console-serving-cert\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.178927 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v89lm\" (UniqueName: \"kubernetes.io/projected/31d61f85-98e7-44c2-8bdf-276559c01389-kube-api-access-v89lm\") pod \"console-858544c664-76x7v\" (UID: \"31d61f85-98e7-44c2-8bdf-276559c01389\") " pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.179259 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-858544c664-76x7v" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.360806 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-858544c664-76x7v"] Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.393427 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c7c6c-f801-40c4-bff9-0f7b740e662b-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.398298 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c10c7c6c-f801-40c4-bff9-0f7b740e662b-plugin-serving-cert\") pod \"nmstate-console-plugin-6ff7998486-wd26t\" (UID: \"c10c7c6c-f801-40c4-bff9-0f7b740e662b\") " pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.532402 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-29rb4" event={"ID":"ffdacd08-7751-465d-a6f2-9037a7307280","Type":"ContainerStarted","Data":"e772032de6e877e71a3d27f59f82397c1421c8db8a7e4e004425b2e4538dd614"} Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.534487 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858544c664-76x7v" event={"ID":"31d61f85-98e7-44c2-8bdf-276559c01389","Type":"ContainerStarted","Data":"51ceaf9b119f6fafb34f0d39285a034994bf0b6d37c1fbec8248b052c5ad8d6e"} Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.534530 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-858544c664-76x7v" 
event={"ID":"31d61f85-98e7-44c2-8bdf-276559c01389","Type":"ContainerStarted","Data":"98fe10954a43c2202679bb2c7aa72ca28d54e14e6d4ba78e0c184ad660882b0a"} Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.535938 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" event={"ID":"e4582acb-2858-4fab-8bc9-e8e6ee6589dd","Type":"ContainerStarted","Data":"2d7b889da927df1d2c87e20fdf363a1dd4ea0726303bb3b11f4aa4970f3fc0f8"} Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.537015 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" event={"ID":"c0fd208a-4408-45b2-88f7-979bf751ada6","Type":"ContainerStarted","Data":"de8baf0932a292cf4a70edb637119f1835c6e674ccdefb80bc3c6b2e76607c35"} Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.552253 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.557504 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-858544c664-76x7v" podStartSLOduration=1.557469405 podStartE2EDuration="1.557469405s" podCreationTimestamp="2025-12-17 09:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:17:54.556398129 +0000 UTC m=+794.216238912" watchObservedRunningTime="2025-12-17 09:17:54.557469405 +0000 UTC m=+794.217310168" Dec 17 09:17:54 crc kubenswrapper[4935]: I1217 09:17:54.745043 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t"] Dec 17 09:17:54 crc kubenswrapper[4935]: W1217 09:17:54.752548 4935 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10c7c6c_f801_40c4_bff9_0f7b740e662b.slice/crio-904e1cb99e2e0ebe16a1fd3d8af145cec8172dfd200d78b655ea7681c0fe1268 WatchSource:0}: Error finding container 904e1cb99e2e0ebe16a1fd3d8af145cec8172dfd200d78b655ea7681c0fe1268: Status 404 returned error can't find the container with id 904e1cb99e2e0ebe16a1fd3d8af145cec8172dfd200d78b655ea7681c0fe1268 Dec 17 09:17:55 crc kubenswrapper[4935]: I1217 09:17:55.546217 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" event={"ID":"c10c7c6c-f801-40c4-bff9-0f7b740e662b","Type":"ContainerStarted","Data":"904e1cb99e2e0ebe16a1fd3d8af145cec8172dfd200d78b655ea7681c0fe1268"} Dec 17 09:17:56 crc kubenswrapper[4935]: I1217 09:17:56.556322 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" event={"ID":"c0fd208a-4408-45b2-88f7-979bf751ada6","Type":"ContainerStarted","Data":"90abc2a89467aba7aeac35dbe7b830b4d755f985e502ae6508bf0f586346735e"} Dec 17 09:17:56 crc kubenswrapper[4935]: I1217 09:17:56.559463 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" event={"ID":"e4582acb-2858-4fab-8bc9-e8e6ee6589dd","Type":"ContainerStarted","Data":"d1f8dc561c84d1f5f34a090377b664181515cdbb5aa74e7b750cde07550868b7"} Dec 17 09:17:56 crc kubenswrapper[4935]: I1217 09:17:56.560020 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:17:56 crc kubenswrapper[4935]: I1217 09:17:56.590667 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" podStartSLOduration=1.3506935279999999 podStartE2EDuration="3.590638389s" podCreationTimestamp="2025-12-17 09:17:53 +0000 UTC" firstStartedPulling="2025-12-17 09:17:54.12631469 +0000 UTC m=+793.786155453" 
lastFinishedPulling="2025-12-17 09:17:56.366259541 +0000 UTC m=+796.026100314" observedRunningTime="2025-12-17 09:17:56.582289785 +0000 UTC m=+796.242130548" watchObservedRunningTime="2025-12-17 09:17:56.590638389 +0000 UTC m=+796.250479152" Dec 17 09:17:57 crc kubenswrapper[4935]: I1217 09:17:57.572403 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" event={"ID":"c10c7c6c-f801-40c4-bff9-0f7b740e662b","Type":"ContainerStarted","Data":"b241beb5ad4f8a6a78c2c4bba1e6bdd003e8762e7afe59ea14208b8593fca3ed"} Dec 17 09:17:57 crc kubenswrapper[4935]: I1217 09:17:57.574198 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-29rb4" event={"ID":"ffdacd08-7751-465d-a6f2-9037a7307280","Type":"ContainerStarted","Data":"931e8cae0a41914e61fdc6daa07ca1f476bb5386d71240dc7bc4806f6c379eb1"} Dec 17 09:17:57 crc kubenswrapper[4935]: I1217 09:17:57.574420 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:17:57 crc kubenswrapper[4935]: I1217 09:17:57.588112 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6ff7998486-wd26t" podStartSLOduration=2.024044624 podStartE2EDuration="4.588084508s" podCreationTimestamp="2025-12-17 09:17:53 +0000 UTC" firstStartedPulling="2025-12-17 09:17:54.756171815 +0000 UTC m=+794.416012578" lastFinishedPulling="2025-12-17 09:17:57.320211699 +0000 UTC m=+796.980052462" observedRunningTime="2025-12-17 09:17:57.584662705 +0000 UTC m=+797.244503478" watchObservedRunningTime="2025-12-17 09:17:57.588084508 +0000 UTC m=+797.247925271" Dec 17 09:17:57 crc kubenswrapper[4935]: I1217 09:17:57.603944 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-29rb4" podStartSLOduration=2.16480722 podStartE2EDuration="4.603924004s" podCreationTimestamp="2025-12-17 
09:17:53 +0000 UTC" firstStartedPulling="2025-12-17 09:17:53.904502334 +0000 UTC m=+793.564343097" lastFinishedPulling="2025-12-17 09:17:56.343619118 +0000 UTC m=+796.003459881" observedRunningTime="2025-12-17 09:17:57.603554566 +0000 UTC m=+797.263395329" watchObservedRunningTime="2025-12-17 09:17:57.603924004 +0000 UTC m=+797.263764767" Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.130658 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.131319 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.131385 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.132291 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0452e123d683b48fb5da02863983a6b9a8cd0b2246d43611082e994f7f11e20b"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.132367 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" 
containerName="machine-config-daemon" containerID="cri-o://0452e123d683b48fb5da02863983a6b9a8cd0b2246d43611082e994f7f11e20b" gracePeriod=600 Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.596430 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="0452e123d683b48fb5da02863983a6b9a8cd0b2246d43611082e994f7f11e20b" exitCode=0 Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.596510 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"0452e123d683b48fb5da02863983a6b9a8cd0b2246d43611082e994f7f11e20b"} Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.596548 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"4beda71bb40d3ba3cd8691bc4fb1531ee2f28ecc5ac8964f7b4a375581ddcde6"} Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.596571 4935 scope.go:117] "RemoveContainer" containerID="c0b861ed431cd38a53635da47f417dba930fef91f14b1f917781fd9245bf2b9b" Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.601911 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" event={"ID":"c0fd208a-4408-45b2-88f7-979bf751ada6","Type":"ContainerStarted","Data":"c77a90ba3a2770d446b93f6925194c18a034587d46d7fc0d1be26727361354b6"} Dec 17 09:18:00 crc kubenswrapper[4935]: I1217 09:18:00.659526 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f7f7578db-n6kbf" podStartSLOduration=1.7224749639999999 podStartE2EDuration="7.659497528s" podCreationTimestamp="2025-12-17 09:17:53 +0000 UTC" firstStartedPulling="2025-12-17 09:17:54.162980455 +0000 UTC m=+793.822821218" 
lastFinishedPulling="2025-12-17 09:18:00.100003019 +0000 UTC m=+799.759843782" observedRunningTime="2025-12-17 09:18:00.658418031 +0000 UTC m=+800.318258794" watchObservedRunningTime="2025-12-17 09:18:00.659497528 +0000 UTC m=+800.319338291" Dec 17 09:18:03 crc kubenswrapper[4935]: I1217 09:18:03.898087 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-29rb4" Dec 17 09:18:04 crc kubenswrapper[4935]: I1217 09:18:04.179889 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-858544c664-76x7v" Dec 17 09:18:04 crc kubenswrapper[4935]: I1217 09:18:04.180072 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-858544c664-76x7v" Dec 17 09:18:04 crc kubenswrapper[4935]: I1217 09:18:04.186289 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-858544c664-76x7v" Dec 17 09:18:04 crc kubenswrapper[4935]: I1217 09:18:04.635072 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-858544c664-76x7v" Dec 17 09:18:04 crc kubenswrapper[4935]: I1217 09:18:04.694638 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nw6k6"] Dec 17 09:18:13 crc kubenswrapper[4935]: I1217 09:18:13.838669 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-f8fb84555-whnkc" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.769670 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6"] Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.772737 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.774620 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.780362 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6"] Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.791351 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.791414 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.791498 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln9sb\" (UniqueName: \"kubernetes.io/projected/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-kube-api-access-ln9sb\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: 
I1217 09:18:27.892792 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.892876 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.892967 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln9sb\" (UniqueName: \"kubernetes.io/projected/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-kube-api-access-ln9sb\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.893422 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-bundle\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.893876 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-util\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:27 crc kubenswrapper[4935]: I1217 09:18:27.913903 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln9sb\" (UniqueName: \"kubernetes.io/projected/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-kube-api-access-ln9sb\") pod \"5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:28 crc kubenswrapper[4935]: I1217 09:18:28.098368 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:28 crc kubenswrapper[4935]: I1217 09:18:28.507162 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6"] Dec 17 09:18:28 crc kubenswrapper[4935]: I1217 09:18:28.816483 4935 generic.go:334] "Generic (PLEG): container finished" podID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerID="5bee0acf15af48786974e1fc41a274c8da145a534c0eabb2947ef0b06569c772" exitCode=0 Dec 17 09:18:28 crc kubenswrapper[4935]: I1217 09:18:28.816559 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" event={"ID":"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d","Type":"ContainerDied","Data":"5bee0acf15af48786974e1fc41a274c8da145a534c0eabb2947ef0b06569c772"} Dec 17 09:18:28 crc kubenswrapper[4935]: I1217 09:18:28.816604 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" event={"ID":"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d","Type":"ContainerStarted","Data":"7808a509d60550e4292d60a705847003ba9d3449a26c425e7e8fde6400ab8315"} Dec 17 09:18:29 crc kubenswrapper[4935]: I1217 09:18:29.741577 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-nw6k6" podUID="a074b884-bf31-47dc-9257-41a7d4dda13e" containerName="console" containerID="cri-o://435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed" gracePeriod=15 Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.113530 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nw6k6_a074b884-bf31-47dc-9257-41a7d4dda13e/console/0.log" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.113968 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.223962 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-oauth-serving-cert\") pod \"a074b884-bf31-47dc-9257-41a7d4dda13e\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.224087 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-trusted-ca-bundle\") pod \"a074b884-bf31-47dc-9257-41a7d4dda13e\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.224117 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-serving-cert\") pod \"a074b884-bf31-47dc-9257-41a7d4dda13e\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.224135 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-oauth-config\") pod \"a074b884-bf31-47dc-9257-41a7d4dda13e\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.224190 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-console-config\") pod \"a074b884-bf31-47dc-9257-41a7d4dda13e\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.224215 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-service-ca\") pod \"a074b884-bf31-47dc-9257-41a7d4dda13e\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.224259 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpsw\" (UniqueName: \"kubernetes.io/projected/a074b884-bf31-47dc-9257-41a7d4dda13e-kube-api-access-ljpsw\") pod \"a074b884-bf31-47dc-9257-41a7d4dda13e\" (UID: \"a074b884-bf31-47dc-9257-41a7d4dda13e\") " Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.225447 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a074b884-bf31-47dc-9257-41a7d4dda13e" (UID: "a074b884-bf31-47dc-9257-41a7d4dda13e"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.225465 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a074b884-bf31-47dc-9257-41a7d4dda13e" (UID: "a074b884-bf31-47dc-9257-41a7d4dda13e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.225459 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-console-config" (OuterVolumeSpecName: "console-config") pod "a074b884-bf31-47dc-9257-41a7d4dda13e" (UID: "a074b884-bf31-47dc-9257-41a7d4dda13e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.225623 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-service-ca" (OuterVolumeSpecName: "service-ca") pod "a074b884-bf31-47dc-9257-41a7d4dda13e" (UID: "a074b884-bf31-47dc-9257-41a7d4dda13e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.231487 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a074b884-bf31-47dc-9257-41a7d4dda13e" (UID: "a074b884-bf31-47dc-9257-41a7d4dda13e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.232327 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a074b884-bf31-47dc-9257-41a7d4dda13e-kube-api-access-ljpsw" (OuterVolumeSpecName: "kube-api-access-ljpsw") pod "a074b884-bf31-47dc-9257-41a7d4dda13e" (UID: "a074b884-bf31-47dc-9257-41a7d4dda13e"). InnerVolumeSpecName "kube-api-access-ljpsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.232681 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a074b884-bf31-47dc-9257-41a7d4dda13e" (UID: "a074b884-bf31-47dc-9257-41a7d4dda13e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.326385 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpsw\" (UniqueName: \"kubernetes.io/projected/a074b884-bf31-47dc-9257-41a7d4dda13e-kube-api-access-ljpsw\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.326526 4935 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.326582 4935 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.326631 4935 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.326685 4935 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a074b884-bf31-47dc-9257-41a7d4dda13e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.326740 4935 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-console-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.326789 4935 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a074b884-bf31-47dc-9257-41a7d4dda13e-service-ca\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.830449 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-nw6k6_a074b884-bf31-47dc-9257-41a7d4dda13e/console/0.log" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.830550 4935 generic.go:334] "Generic (PLEG): container finished" podID="a074b884-bf31-47dc-9257-41a7d4dda13e" containerID="435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed" exitCode=2 Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.830624 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nw6k6" event={"ID":"a074b884-bf31-47dc-9257-41a7d4dda13e","Type":"ContainerDied","Data":"435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed"} Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.830664 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-nw6k6" 
event={"ID":"a074b884-bf31-47dc-9257-41a7d4dda13e","Type":"ContainerDied","Data":"380aa846562ba6cde2a760c3333d15d04b8441abbdcf855ed87b9775d4b7f36a"} Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.830687 4935 scope.go:117] "RemoveContainer" containerID="435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.830825 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-nw6k6" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.836046 4935 generic.go:334] "Generic (PLEG): container finished" podID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerID="4ecd403cdbee51c324a732ec5f3079276baf16ec056182e27a6c91bb52b2ad70" exitCode=0 Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.836089 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" event={"ID":"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d","Type":"ContainerDied","Data":"4ecd403cdbee51c324a732ec5f3079276baf16ec056182e27a6c91bb52b2ad70"} Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.865592 4935 scope.go:117] "RemoveContainer" containerID="435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed" Dec 17 09:18:30 crc kubenswrapper[4935]: E1217 09:18:30.866213 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed\": container with ID starting with 435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed not found: ID does not exist" containerID="435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.866319 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed"} err="failed to get container status \"435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed\": rpc error: code = NotFound desc = could not find container \"435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed\": container with ID starting with 435db58345877bd6109170520417b653d28ba35b556ce3249912765843d6f0ed not found: ID does not exist" Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.876046 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-nw6k6"] Dec 17 09:18:30 crc kubenswrapper[4935]: I1217 09:18:30.879192 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-nw6k6"] Dec 17 09:18:31 crc kubenswrapper[4935]: I1217 09:18:31.135318 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a074b884-bf31-47dc-9257-41a7d4dda13e" path="/var/lib/kubelet/pods/a074b884-bf31-47dc-9257-41a7d4dda13e/volumes" Dec 17 09:18:31 crc kubenswrapper[4935]: I1217 09:18:31.844827 4935 generic.go:334] "Generic (PLEG): container finished" podID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerID="b1e3ac7b6d2be73815dfdd899d68395df646104285777325c6918eac999eda1c" exitCode=0 Dec 17 09:18:31 crc kubenswrapper[4935]: I1217 09:18:31.844875 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" event={"ID":"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d","Type":"ContainerDied","Data":"b1e3ac7b6d2be73815dfdd899d68395df646104285777325c6918eac999eda1c"} Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.061780 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.264124 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ln9sb\" (UniqueName: \"kubernetes.io/projected/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-kube-api-access-ln9sb\") pod \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.264177 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-util\") pod \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.264219 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-bundle\") pod \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\" (UID: \"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d\") " Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.265499 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-bundle" (OuterVolumeSpecName: "bundle") pod "17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" (UID: "17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.270257 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-kube-api-access-ln9sb" (OuterVolumeSpecName: "kube-api-access-ln9sb") pod "17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" (UID: "17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d"). InnerVolumeSpecName "kube-api-access-ln9sb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.281610 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-util" (OuterVolumeSpecName: "util") pod "17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" (UID: "17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.365702 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ln9sb\" (UniqueName: \"kubernetes.io/projected/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-kube-api-access-ln9sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.365736 4935 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-util\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.365843 4935 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.858142 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" event={"ID":"17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d","Type":"ContainerDied","Data":"7808a509d60550e4292d60a705847003ba9d3449a26c425e7e8fde6400ab8315"} Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.858196 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7808a509d60550e4292d60a705847003ba9d3449a26c425e7e8fde6400ab8315" Dec 17 09:18:33 crc kubenswrapper[4935]: I1217 09:18:33.858197 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.382206 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7"] Dec 17 09:18:42 crc kubenswrapper[4935]: E1217 09:18:42.383185 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerName="extract" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.383207 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerName="extract" Dec 17 09:18:42 crc kubenswrapper[4935]: E1217 09:18:42.383226 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerName="pull" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.383234 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerName="pull" Dec 17 09:18:42 crc kubenswrapper[4935]: E1217 09:18:42.383245 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a074b884-bf31-47dc-9257-41a7d4dda13e" containerName="console" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.383253 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a074b884-bf31-47dc-9257-41a7d4dda13e" containerName="console" Dec 17 09:18:42 crc kubenswrapper[4935]: E1217 09:18:42.383261 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerName="util" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.383292 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" containerName="util" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.383431 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d" 
containerName="extract" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.383455 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a074b884-bf31-47dc-9257-41a7d4dda13e" containerName="console" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.383947 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.386974 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.387221 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.387473 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zxsc2" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.387642 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.395625 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.414774 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7"] Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.551164 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1068b00-4182-43a7-aa77-e2521de014b7-webhook-cert\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 
09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.551223 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldc8c\" (UniqueName: \"kubernetes.io/projected/d1068b00-4182-43a7-aa77-e2521de014b7-kube-api-access-ldc8c\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.551250 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1068b00-4182-43a7-aa77-e2521de014b7-apiservice-cert\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.636017 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c"] Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.636812 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.639176 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-fs46x" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.639692 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.641952 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.652863 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldc8c\" (UniqueName: \"kubernetes.io/projected/d1068b00-4182-43a7-aa77-e2521de014b7-kube-api-access-ldc8c\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.653135 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1068b00-4182-43a7-aa77-e2521de014b7-apiservice-cert\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.653382 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1068b00-4182-43a7-aa77-e2521de014b7-webhook-cert\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc 
kubenswrapper[4935]: I1217 09:18:42.657074 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c"] Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.665477 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d1068b00-4182-43a7-aa77-e2521de014b7-apiservice-cert\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.679992 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d1068b00-4182-43a7-aa77-e2521de014b7-webhook-cert\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.684370 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldc8c\" (UniqueName: \"kubernetes.io/projected/d1068b00-4182-43a7-aa77-e2521de014b7-kube-api-access-ldc8c\") pod \"metallb-operator-controller-manager-64d8b49b46-7fst7\" (UID: \"d1068b00-4182-43a7-aa77-e2521de014b7\") " pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.700954 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.755214 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7aeed44-1b4a-4d58-ac7f-077576b37887-apiservice-cert\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.755332 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7aeed44-1b4a-4d58-ac7f-077576b37887-webhook-cert\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.755364 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvghm\" (UniqueName: \"kubernetes.io/projected/a7aeed44-1b4a-4d58-ac7f-077576b37887-kube-api-access-qvghm\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.856457 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7aeed44-1b4a-4d58-ac7f-077576b37887-apiservice-cert\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.856559 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7aeed44-1b4a-4d58-ac7f-077576b37887-webhook-cert\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.856588 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvghm\" (UniqueName: \"kubernetes.io/projected/a7aeed44-1b4a-4d58-ac7f-077576b37887-kube-api-access-qvghm\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.866799 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a7aeed44-1b4a-4d58-ac7f-077576b37887-apiservice-cert\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.866989 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a7aeed44-1b4a-4d58-ac7f-077576b37887-webhook-cert\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.882108 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvghm\" (UniqueName: \"kubernetes.io/projected/a7aeed44-1b4a-4d58-ac7f-077576b37887-kube-api-access-qvghm\") pod \"metallb-operator-webhook-server-6c67684f-qpd2c\" (UID: \"a7aeed44-1b4a-4d58-ac7f-077576b37887\") " 
pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:42 crc kubenswrapper[4935]: I1217 09:18:42.958794 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:43 crc kubenswrapper[4935]: I1217 09:18:43.149478 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7"] Dec 17 09:18:43 crc kubenswrapper[4935]: I1217 09:18:43.252009 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c"] Dec 17 09:18:43 crc kubenswrapper[4935]: W1217 09:18:43.256546 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7aeed44_1b4a_4d58_ac7f_077576b37887.slice/crio-bf1a91ad2fdd2bf087a1f8db353c4213a2bb462f22f2b3fc3aefcbf6346dc9d6 WatchSource:0}: Error finding container bf1a91ad2fdd2bf087a1f8db353c4213a2bb462f22f2b3fc3aefcbf6346dc9d6: Status 404 returned error can't find the container with id bf1a91ad2fdd2bf087a1f8db353c4213a2bb462f22f2b3fc3aefcbf6346dc9d6 Dec 17 09:18:43 crc kubenswrapper[4935]: I1217 09:18:43.934092 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" event={"ID":"d1068b00-4182-43a7-aa77-e2521de014b7","Type":"ContainerStarted","Data":"3a994a5bd40ce4bd0586a8260aa3ecf73c7edd84c4592b2fa0603f8063344dfa"} Dec 17 09:18:43 crc kubenswrapper[4935]: I1217 09:18:43.937247 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" event={"ID":"a7aeed44-1b4a-4d58-ac7f-077576b37887","Type":"ContainerStarted","Data":"bf1a91ad2fdd2bf087a1f8db353c4213a2bb462f22f2b3fc3aefcbf6346dc9d6"} Dec 17 09:18:50 crc kubenswrapper[4935]: I1217 09:18:50.994187 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" event={"ID":"a7aeed44-1b4a-4d58-ac7f-077576b37887","Type":"ContainerStarted","Data":"f897d4e0ee89852e47d19d56ae9e029cdb6a244a711b94477b7345d53e0b8e2f"} Dec 17 09:18:50 crc kubenswrapper[4935]: I1217 09:18:50.994921 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:18:50 crc kubenswrapper[4935]: I1217 09:18:50.996723 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" event={"ID":"d1068b00-4182-43a7-aa77-e2521de014b7","Type":"ContainerStarted","Data":"69a3f066e9de0765a53834958f6527055f4ac31cd7e00ca8318f780c8da5f250"} Dec 17 09:18:50 crc kubenswrapper[4935]: I1217 09:18:50.996880 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:18:51 crc kubenswrapper[4935]: I1217 09:18:51.021502 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" podStartSLOduration=2.290111705 podStartE2EDuration="9.021471561s" podCreationTimestamp="2025-12-17 09:18:42 +0000 UTC" firstStartedPulling="2025-12-17 09:18:43.264875777 +0000 UTC m=+842.924716540" lastFinishedPulling="2025-12-17 09:18:49.996235633 +0000 UTC m=+849.656076396" observedRunningTime="2025-12-17 09:18:51.018865018 +0000 UTC m=+850.678705811" watchObservedRunningTime="2025-12-17 09:18:51.021471561 +0000 UTC m=+850.681312324" Dec 17 09:18:51 crc kubenswrapper[4935]: I1217 09:18:51.045388 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" podStartSLOduration=2.23540391 podStartE2EDuration="9.045361204s" podCreationTimestamp="2025-12-17 09:18:42 +0000 UTC" firstStartedPulling="2025-12-17 09:18:43.170115294 
+0000 UTC m=+842.829956047" lastFinishedPulling="2025-12-17 09:18:49.980072578 +0000 UTC m=+849.639913341" observedRunningTime="2025-12-17 09:18:51.039518552 +0000 UTC m=+850.699359325" watchObservedRunningTime="2025-12-17 09:18:51.045361204 +0000 UTC m=+850.705201967" Dec 17 09:19:02 crc kubenswrapper[4935]: I1217 09:19:02.963939 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6c67684f-qpd2c" Dec 17 09:19:22 crc kubenswrapper[4935]: I1217 09:19:22.704637 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-64d8b49b46-7fst7" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.507235 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92"] Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.508874 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.511457 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-k268q"] Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.512006 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-fg7fv" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.515248 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.518547 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92"] Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.520571 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.520656 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.521057 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554576 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vsr7\" (UniqueName: \"kubernetes.io/projected/88717473-e7d0-4a23-b03b-5cada6284ad1-kube-api-access-9vsr7\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554651 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88717473-e7d0-4a23-b03b-5cada6284ad1-metrics-certs\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554674 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjpk6\" (UniqueName: \"kubernetes.io/projected/4331fce4-cd29-4cfc-90c0-45a97c6596a4-kube-api-access-kjpk6\") pod \"frr-k8s-webhook-server-7784b6fcf-7pk92\" (UID: \"4331fce4-cd29-4cfc-90c0-45a97c6596a4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc 
kubenswrapper[4935]: I1217 09:19:23.554701 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4331fce4-cd29-4cfc-90c0-45a97c6596a4-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-7pk92\" (UID: \"4331fce4-cd29-4cfc-90c0-45a97c6596a4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554741 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-metrics\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554782 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-startup\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554814 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-reloader\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554835 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-conf\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.554862 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-sockets\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.601910 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-z8lgk"] Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.603157 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.607816 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.607972 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-z6mn7" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.608205 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.608716 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.628245 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5bddd4b946-sg456"] Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.629345 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.637441 4935 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.653596 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-sg456"] Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656227 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vsr7\" (UniqueName: \"kubernetes.io/projected/88717473-e7d0-4a23-b03b-5cada6284ad1-kube-api-access-9vsr7\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656263 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88717473-e7d0-4a23-b03b-5cada6284ad1-metrics-certs\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656308 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4331fce4-cd29-4cfc-90c0-45a97c6596a4-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-7pk92\" (UID: \"4331fce4-cd29-4cfc-90c0-45a97c6596a4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656349 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjpk6\" (UniqueName: \"kubernetes.io/projected/4331fce4-cd29-4cfc-90c0-45a97c6596a4-kube-api-access-kjpk6\") pod \"frr-k8s-webhook-server-7784b6fcf-7pk92\" (UID: \"4331fce4-cd29-4cfc-90c0-45a97c6596a4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc 
kubenswrapper[4935]: I1217 09:19:23.656395 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-metrics\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656599 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-startup\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656636 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-reloader\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656656 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-conf\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.656676 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-sockets\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.657269 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-sockets\") pod 
\"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.657859 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-reloader\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.658015 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-metrics\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.658305 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-conf\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.658830 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/88717473-e7d0-4a23-b03b-5cada6284ad1-frr-startup\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.665817 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4331fce4-cd29-4cfc-90c0-45a97c6596a4-cert\") pod \"frr-k8s-webhook-server-7784b6fcf-7pk92\" (UID: \"4331fce4-cd29-4cfc-90c0-45a97c6596a4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.667944 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/88717473-e7d0-4a23-b03b-5cada6284ad1-metrics-certs\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.683616 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjpk6\" (UniqueName: \"kubernetes.io/projected/4331fce4-cd29-4cfc-90c0-45a97c6596a4-kube-api-access-kjpk6\") pod \"frr-k8s-webhook-server-7784b6fcf-7pk92\" (UID: \"4331fce4-cd29-4cfc-90c0-45a97c6596a4\") " pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.684229 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vsr7\" (UniqueName: \"kubernetes.io/projected/88717473-e7d0-4a23-b03b-5cada6284ad1-kube-api-access-9vsr7\") pod \"frr-k8s-k268q\" (UID: \"88717473-e7d0-4a23-b03b-5cada6284ad1\") " pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.757876 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c162bc-0446-4ce3-a601-3fb687465161-cert\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.757929 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metallb-excludel2\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.758011 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.758056 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30c162bc-0446-4ce3-a601-3fb687465161-metrics-certs\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.758124 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmckc\" (UniqueName: \"kubernetes.io/projected/137edfb5-6e98-4aef-8a75-bf14297a7d3d-kube-api-access-tmckc\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.758154 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metrics-certs\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.758177 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfxjh\" (UniqueName: \"kubernetes.io/projected/30c162bc-0446-4ce3-a601-3fb687465161-kube-api-access-zfxjh\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.829317 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.837387 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.863168 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.863233 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30c162bc-0446-4ce3-a601-3fb687465161-metrics-certs\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: E1217 09:19:23.863423 4935 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 17 09:19:23 crc kubenswrapper[4935]: E1217 09:19:23.863543 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist podName:137edfb5-6e98-4aef-8a75-bf14297a7d3d nodeName:}" failed. No retries permitted until 2025-12-17 09:19:24.363512719 +0000 UTC m=+884.023353662 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist") pod "speaker-z8lgk" (UID: "137edfb5-6e98-4aef-8a75-bf14297a7d3d") : secret "metallb-memberlist" not found Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.863272 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmckc\" (UniqueName: \"kubernetes.io/projected/137edfb5-6e98-4aef-8a75-bf14297a7d3d-kube-api-access-tmckc\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.865122 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metrics-certs\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.865161 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfxjh\" (UniqueName: \"kubernetes.io/projected/30c162bc-0446-4ce3-a601-3fb687465161-kube-api-access-zfxjh\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: E1217 09:19:23.865185 4935 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.865197 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c162bc-0446-4ce3-a601-3fb687465161-cert\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: E1217 
09:19:23.865220 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metrics-certs podName:137edfb5-6e98-4aef-8a75-bf14297a7d3d nodeName:}" failed. No retries permitted until 2025-12-17 09:19:24.365210501 +0000 UTC m=+884.025051264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metrics-certs") pod "speaker-z8lgk" (UID: "137edfb5-6e98-4aef-8a75-bf14297a7d3d") : secret "speaker-certs-secret" not found Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.865248 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metallb-excludel2\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.866175 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metallb-excludel2\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.871032 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/30c162bc-0446-4ce3-a601-3fb687465161-metrics-certs\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.886798 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/30c162bc-0446-4ce3-a601-3fb687465161-cert\") pod \"controller-5bddd4b946-sg456\" (UID: 
\"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.917112 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfxjh\" (UniqueName: \"kubernetes.io/projected/30c162bc-0446-4ce3-a601-3fb687465161-kube-api-access-zfxjh\") pod \"controller-5bddd4b946-sg456\" (UID: \"30c162bc-0446-4ce3-a601-3fb687465161\") " pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.918188 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmckc\" (UniqueName: \"kubernetes.io/projected/137edfb5-6e98-4aef-8a75-bf14297a7d3d-kube-api-access-tmckc\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:23 crc kubenswrapper[4935]: I1217 09:19:23.944892 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:24 crc kubenswrapper[4935]: I1217 09:19:24.209823 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerStarted","Data":"c29f24f75268673c7d4e4dd3557a9e1ed0f11002606911cc95b15c63d574ceff"} Dec 17 09:19:24 crc kubenswrapper[4935]: I1217 09:19:24.241683 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92"] Dec 17 09:19:24 crc kubenswrapper[4935]: W1217 09:19:24.250713 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4331fce4_cd29_4cfc_90c0_45a97c6596a4.slice/crio-23e7f64d0dfb83a889432d05ddad606b414fd5616235f3a7dbd6812ecc7307c0 WatchSource:0}: Error finding container 23e7f64d0dfb83a889432d05ddad606b414fd5616235f3a7dbd6812ecc7307c0: Status 404 returned error can't find the 
container with id 23e7f64d0dfb83a889432d05ddad606b414fd5616235f3a7dbd6812ecc7307c0 Dec 17 09:19:24 crc kubenswrapper[4935]: I1217 09:19:24.275140 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5bddd4b946-sg456"] Dec 17 09:19:24 crc kubenswrapper[4935]: W1217 09:19:24.283719 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c162bc_0446_4ce3_a601_3fb687465161.slice/crio-ddeb9c297285340a9c1667fde0fee260095203aaa83cacec6e9e2fce80ddf23c WatchSource:0}: Error finding container ddeb9c297285340a9c1667fde0fee260095203aaa83cacec6e9e2fce80ddf23c: Status 404 returned error can't find the container with id ddeb9c297285340a9c1667fde0fee260095203aaa83cacec6e9e2fce80ddf23c Dec 17 09:19:24 crc kubenswrapper[4935]: I1217 09:19:24.377920 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:24 crc kubenswrapper[4935]: I1217 09:19:24.377979 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metrics-certs\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:24 crc kubenswrapper[4935]: E1217 09:19:24.378708 4935 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 17 09:19:24 crc kubenswrapper[4935]: E1217 09:19:24.378822 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist podName:137edfb5-6e98-4aef-8a75-bf14297a7d3d nodeName:}" failed. 
No retries permitted until 2025-12-17 09:19:25.378783948 +0000 UTC m=+885.038624701 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist") pod "speaker-z8lgk" (UID: "137edfb5-6e98-4aef-8a75-bf14297a7d3d") : secret "metallb-memberlist" not found Dec 17 09:19:24 crc kubenswrapper[4935]: I1217 09:19:24.387637 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-metrics-certs\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.217757 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" event={"ID":"4331fce4-cd29-4cfc-90c0-45a97c6596a4","Type":"ContainerStarted","Data":"23e7f64d0dfb83a889432d05ddad606b414fd5616235f3a7dbd6812ecc7307c0"} Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.220083 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-sg456" event={"ID":"30c162bc-0446-4ce3-a601-3fb687465161","Type":"ContainerStarted","Data":"1464539dbfc4df89d24fe7906ca0c5b0810d3caec997b95c2a741098e3fa6736"} Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.220118 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-sg456" event={"ID":"30c162bc-0446-4ce3-a601-3fb687465161","Type":"ContainerStarted","Data":"ffb11c972beb8f0ba7f2d2a6f4aa2cc16af2d8ac5927ffe66d9f1be69b7430fe"} Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.220134 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5bddd4b946-sg456" event={"ID":"30c162bc-0446-4ce3-a601-3fb687465161","Type":"ContainerStarted","Data":"ddeb9c297285340a9c1667fde0fee260095203aaa83cacec6e9e2fce80ddf23c"} Dec 
17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.220321 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.247800 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5bddd4b946-sg456" podStartSLOduration=2.247780272 podStartE2EDuration="2.247780272s" podCreationTimestamp="2025-12-17 09:19:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:19:25.242668377 +0000 UTC m=+884.902509140" watchObservedRunningTime="2025-12-17 09:19:25.247780272 +0000 UTC m=+884.907621035" Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.395648 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.401480 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/137edfb5-6e98-4aef-8a75-bf14297a7d3d-memberlist\") pod \"speaker-z8lgk\" (UID: \"137edfb5-6e98-4aef-8a75-bf14297a7d3d\") " pod="metallb-system/speaker-z8lgk" Dec 17 09:19:25 crc kubenswrapper[4935]: I1217 09:19:25.417627 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-z8lgk" Dec 17 09:19:26 crc kubenswrapper[4935]: I1217 09:19:26.255805 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z8lgk" event={"ID":"137edfb5-6e98-4aef-8a75-bf14297a7d3d","Type":"ContainerStarted","Data":"f42cd393571d94f2caa806f85f97f11ccc41f1ebc380e84571ffec5684995cf5"} Dec 17 09:19:26 crc kubenswrapper[4935]: I1217 09:19:26.256181 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z8lgk" event={"ID":"137edfb5-6e98-4aef-8a75-bf14297a7d3d","Type":"ContainerStarted","Data":"3c3a9c765fe6df72e21221c283cb77ec68772d5b27091296857e44b6f1880b2b"} Dec 17 09:19:26 crc kubenswrapper[4935]: I1217 09:19:26.256200 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-z8lgk" event={"ID":"137edfb5-6e98-4aef-8a75-bf14297a7d3d","Type":"ContainerStarted","Data":"afd2ee2d4c7080161db292c39ce679217147691ac70227d01c88a6ea3dbd12c0"} Dec 17 09:19:26 crc kubenswrapper[4935]: I1217 09:19:26.256963 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-z8lgk" Dec 17 09:19:26 crc kubenswrapper[4935]: I1217 09:19:26.303731 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-z8lgk" podStartSLOduration=3.30371312 podStartE2EDuration="3.30371312s" podCreationTimestamp="2025-12-17 09:19:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:19:26.289250317 +0000 UTC m=+885.949091080" watchObservedRunningTime="2025-12-17 09:19:26.30371312 +0000 UTC m=+885.963553883" Dec 17 09:19:33 crc kubenswrapper[4935]: I1217 09:19:33.298765 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" 
event={"ID":"4331fce4-cd29-4cfc-90c0-45a97c6596a4","Type":"ContainerStarted","Data":"74c21d72b4bf0ede609f1eaf376b1e7347eb5f2ae96a2ae4f19b4658272eba56"} Dec 17 09:19:33 crc kubenswrapper[4935]: I1217 09:19:33.299621 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:33 crc kubenswrapper[4935]: I1217 09:19:33.301078 4935 generic.go:334] "Generic (PLEG): container finished" podID="88717473-e7d0-4a23-b03b-5cada6284ad1" containerID="75ded70992b2f0c657a1789370d7aca06c177500d32a4e9e209707bc43d127c9" exitCode=0 Dec 17 09:19:33 crc kubenswrapper[4935]: I1217 09:19:33.301130 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerDied","Data":"75ded70992b2f0c657a1789370d7aca06c177500d32a4e9e209707bc43d127c9"} Dec 17 09:19:33 crc kubenswrapper[4935]: I1217 09:19:33.321041 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" podStartSLOduration=1.9482513099999998 podStartE2EDuration="10.321012576s" podCreationTimestamp="2025-12-17 09:19:23 +0000 UTC" firstStartedPulling="2025-12-17 09:19:24.254390061 +0000 UTC m=+883.914230824" lastFinishedPulling="2025-12-17 09:19:32.627151327 +0000 UTC m=+892.286992090" observedRunningTime="2025-12-17 09:19:33.318494185 +0000 UTC m=+892.978334948" watchObservedRunningTime="2025-12-17 09:19:33.321012576 +0000 UTC m=+892.980853349" Dec 17 09:19:34 crc kubenswrapper[4935]: I1217 09:19:34.311265 4935 generic.go:334] "Generic (PLEG): container finished" podID="88717473-e7d0-4a23-b03b-5cada6284ad1" containerID="a63f58499b1b51e46a1cfea28ad0cf2ce38a0c225b0c93d18db896ee28b901fa" exitCode=0 Dec 17 09:19:34 crc kubenswrapper[4935]: I1217 09:19:34.311396 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" 
event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerDied","Data":"a63f58499b1b51e46a1cfea28ad0cf2ce38a0c225b0c93d18db896ee28b901fa"} Dec 17 09:19:35 crc kubenswrapper[4935]: I1217 09:19:35.321444 4935 generic.go:334] "Generic (PLEG): container finished" podID="88717473-e7d0-4a23-b03b-5cada6284ad1" containerID="7cfa854cf9cc070a2e93b44dbc580fb7f00f64ca39f41c8ec82134248ee5535c" exitCode=0 Dec 17 09:19:35 crc kubenswrapper[4935]: I1217 09:19:35.321516 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerDied","Data":"7cfa854cf9cc070a2e93b44dbc580fb7f00f64ca39f41c8ec82134248ee5535c"} Dec 17 09:19:35 crc kubenswrapper[4935]: I1217 09:19:35.421977 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-z8lgk" Dec 17 09:19:36 crc kubenswrapper[4935]: I1217 09:19:36.333748 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerStarted","Data":"d037950986b8f7c5af6738766c77e4b7703d97714cb3dccd674d9d2b2c4e84f4"} Dec 17 09:19:36 crc kubenswrapper[4935]: I1217 09:19:36.334443 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerStarted","Data":"f27b7ff104a057a20535988396b5f86699905bd18525e5e80c1925c3f0428776"} Dec 17 09:19:36 crc kubenswrapper[4935]: I1217 09:19:36.334458 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerStarted","Data":"7daa73f2d665aa06daf525372b79088b79059a33c936ebc8d7eaf9f8d7dd8359"} Dec 17 09:19:36 crc kubenswrapper[4935]: I1217 09:19:36.334470 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" 
event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerStarted","Data":"a9cdc3a59a077da024e794125ff8a9802fe8e20a8a3192f114aef773e3371e4e"} Dec 17 09:19:36 crc kubenswrapper[4935]: I1217 09:19:36.334487 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerStarted","Data":"2dc94ad0a947de0ef3a6a2277eda77fd2aa058e87aa2f667e3eb9027f5671ee6"} Dec 17 09:19:37 crc kubenswrapper[4935]: I1217 09:19:37.345042 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-k268q" event={"ID":"88717473-e7d0-4a23-b03b-5cada6284ad1","Type":"ContainerStarted","Data":"3547b52c84818c43b0f92b6e6efd3f7204af498ce283892758d671f54e67b258"} Dec 17 09:19:37 crc kubenswrapper[4935]: I1217 09:19:37.345184 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:37 crc kubenswrapper[4935]: I1217 09:19:37.370116 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-k268q" podStartSLOduration=5.757103492 podStartE2EDuration="14.370096902s" podCreationTimestamp="2025-12-17 09:19:23 +0000 UTC" firstStartedPulling="2025-12-17 09:19:24.035365755 +0000 UTC m=+883.695206518" lastFinishedPulling="2025-12-17 09:19:32.648359165 +0000 UTC m=+892.308199928" observedRunningTime="2025-12-17 09:19:37.367738464 +0000 UTC m=+897.027579237" watchObservedRunningTime="2025-12-17 09:19:37.370096902 +0000 UTC m=+897.029937665" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.306820 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-scbb4"] Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.307735 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-scbb4" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.311530 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.311613 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-6k2jc" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.312430 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.326084 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scbb4"] Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.435778 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v6vj\" (UniqueName: \"kubernetes.io/projected/30d5073f-a077-4a04-9d7a-90f3b7ca9c15-kube-api-access-9v6vj\") pod \"openstack-operator-index-scbb4\" (UID: \"30d5073f-a077-4a04-9d7a-90f3b7ca9c15\") " pod="openstack-operators/openstack-operator-index-scbb4" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.537179 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v6vj\" (UniqueName: \"kubernetes.io/projected/30d5073f-a077-4a04-9d7a-90f3b7ca9c15-kube-api-access-9v6vj\") pod \"openstack-operator-index-scbb4\" (UID: \"30d5073f-a077-4a04-9d7a-90f3b7ca9c15\") " pod="openstack-operators/openstack-operator-index-scbb4" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.562970 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v6vj\" (UniqueName: \"kubernetes.io/projected/30d5073f-a077-4a04-9d7a-90f3b7ca9c15-kube-api-access-9v6vj\") pod \"openstack-operator-index-scbb4\" (UID: 
\"30d5073f-a077-4a04-9d7a-90f3b7ca9c15\") " pod="openstack-operators/openstack-operator-index-scbb4" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.634907 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-scbb4" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.837961 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.879567 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:38 crc kubenswrapper[4935]: I1217 09:19:38.964609 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-scbb4"] Dec 17 09:19:39 crc kubenswrapper[4935]: I1217 09:19:39.359618 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scbb4" event={"ID":"30d5073f-a077-4a04-9d7a-90f3b7ca9c15","Type":"ContainerStarted","Data":"dbbb43d8c858376a0c4f83cf6ff714efbfa939d0b1e803ed78ef1467fa0c5675"} Dec 17 09:19:41 crc kubenswrapper[4935]: I1217 09:19:41.375182 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scbb4" event={"ID":"30d5073f-a077-4a04-9d7a-90f3b7ca9c15","Type":"ContainerStarted","Data":"dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270"} Dec 17 09:19:41 crc kubenswrapper[4935]: I1217 09:19:41.402607 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-scbb4" podStartSLOduration=1.728345911 podStartE2EDuration="3.402580223s" podCreationTimestamp="2025-12-17 09:19:38 +0000 UTC" firstStartedPulling="2025-12-17 09:19:38.974090508 +0000 UTC m=+898.633931271" lastFinishedPulling="2025-12-17 09:19:40.64832482 +0000 UTC m=+900.308165583" observedRunningTime="2025-12-17 09:19:41.398496173 +0000 
UTC m=+901.058336936" watchObservedRunningTime="2025-12-17 09:19:41.402580223 +0000 UTC m=+901.062421016" Dec 17 09:19:41 crc kubenswrapper[4935]: I1217 09:19:41.684835 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-scbb4"] Dec 17 09:19:42 crc kubenswrapper[4935]: I1217 09:19:42.295682 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fb28k"] Dec 17 09:19:42 crc kubenswrapper[4935]: I1217 09:19:42.296566 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:42 crc kubenswrapper[4935]: I1217 09:19:42.310766 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fb28k"] Dec 17 09:19:42 crc kubenswrapper[4935]: I1217 09:19:42.399382 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpt7\" (UniqueName: \"kubernetes.io/projected/9a5f90ee-af6e-42ff-94b9-87b969461bee-kube-api-access-kvpt7\") pod \"openstack-operator-index-fb28k\" (UID: \"9a5f90ee-af6e-42ff-94b9-87b969461bee\") " pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:42 crc kubenswrapper[4935]: I1217 09:19:42.500862 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpt7\" (UniqueName: \"kubernetes.io/projected/9a5f90ee-af6e-42ff-94b9-87b969461bee-kube-api-access-kvpt7\") pod \"openstack-operator-index-fb28k\" (UID: \"9a5f90ee-af6e-42ff-94b9-87b969461bee\") " pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:42 crc kubenswrapper[4935]: I1217 09:19:42.520398 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpt7\" (UniqueName: \"kubernetes.io/projected/9a5f90ee-af6e-42ff-94b9-87b969461bee-kube-api-access-kvpt7\") pod \"openstack-operator-index-fb28k\" (UID: 
\"9a5f90ee-af6e-42ff-94b9-87b969461bee\") " pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:42 crc kubenswrapper[4935]: I1217 09:19:42.641258 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.134650 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fb28k"] Dec 17 09:19:43 crc kubenswrapper[4935]: W1217 09:19:43.145603 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a5f90ee_af6e_42ff_94b9_87b969461bee.slice/crio-5a18cce52f477c0cbb0cb63485e57bc8fe0ce2af53d6c52e75949474526b1542 WatchSource:0}: Error finding container 5a18cce52f477c0cbb0cb63485e57bc8fe0ce2af53d6c52e75949474526b1542: Status 404 returned error can't find the container with id 5a18cce52f477c0cbb0cb63485e57bc8fe0ce2af53d6c52e75949474526b1542 Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.387515 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fb28k" event={"ID":"9a5f90ee-af6e-42ff-94b9-87b969461bee","Type":"ContainerStarted","Data":"5a18cce52f477c0cbb0cb63485e57bc8fe0ce2af53d6c52e75949474526b1542"} Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.387670 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-scbb4" podUID="30d5073f-a077-4a04-9d7a-90f3b7ca9c15" containerName="registry-server" containerID="cri-o://dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270" gracePeriod=2 Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.742594 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-scbb4" Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.817757 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v6vj\" (UniqueName: \"kubernetes.io/projected/30d5073f-a077-4a04-9d7a-90f3b7ca9c15-kube-api-access-9v6vj\") pod \"30d5073f-a077-4a04-9d7a-90f3b7ca9c15\" (UID: \"30d5073f-a077-4a04-9d7a-90f3b7ca9c15\") " Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.822917 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d5073f-a077-4a04-9d7a-90f3b7ca9c15-kube-api-access-9v6vj" (OuterVolumeSpecName: "kube-api-access-9v6vj") pod "30d5073f-a077-4a04-9d7a-90f3b7ca9c15" (UID: "30d5073f-a077-4a04-9d7a-90f3b7ca9c15"). InnerVolumeSpecName "kube-api-access-9v6vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.836645 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7784b6fcf-7pk92" Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.919479 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v6vj\" (UniqueName: \"kubernetes.io/projected/30d5073f-a077-4a04-9d7a-90f3b7ca9c15-kube-api-access-9v6vj\") on node \"crc\" DevicePath \"\"" Dec 17 09:19:43 crc kubenswrapper[4935]: I1217 09:19:43.949257 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5bddd4b946-sg456" Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.395317 4935 generic.go:334] "Generic (PLEG): container finished" podID="30d5073f-a077-4a04-9d7a-90f3b7ca9c15" containerID="dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270" exitCode=0 Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.395360 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-scbb4" event={"ID":"30d5073f-a077-4a04-9d7a-90f3b7ca9c15","Type":"ContainerDied","Data":"dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270"} Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.395387 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-scbb4" event={"ID":"30d5073f-a077-4a04-9d7a-90f3b7ca9c15","Type":"ContainerDied","Data":"dbbb43d8c858376a0c4f83cf6ff714efbfa939d0b1e803ed78ef1467fa0c5675"} Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.395404 4935 scope.go:117] "RemoveContainer" containerID="dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270" Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.395502 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-scbb4" Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.425726 4935 scope.go:117] "RemoveContainer" containerID="dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270" Dec 17 09:19:44 crc kubenswrapper[4935]: E1217 09:19:44.428569 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270\": container with ID starting with dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270 not found: ID does not exist" containerID="dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270" Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.428736 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270"} err="failed to get container status \"dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270\": rpc error: code = NotFound desc = could not find container 
\"dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270\": container with ID starting with dd286b5c39f3f23d8c85da03b500186bf9ca9316d54db88250fad4519cab0270 not found: ID does not exist" Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.436492 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-scbb4"] Dec 17 09:19:44 crc kubenswrapper[4935]: I1217 09:19:44.440594 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-scbb4"] Dec 17 09:19:45 crc kubenswrapper[4935]: I1217 09:19:45.133087 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d5073f-a077-4a04-9d7a-90f3b7ca9c15" path="/var/lib/kubelet/pods/30d5073f-a077-4a04-9d7a-90f3b7ca9c15/volumes" Dec 17 09:19:45 crc kubenswrapper[4935]: I1217 09:19:45.405752 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fb28k" event={"ID":"9a5f90ee-af6e-42ff-94b9-87b969461bee","Type":"ContainerStarted","Data":"79ae376d033d047a5ed7febebfe2e9f66736114f849177df5cbb9d1023b41cb0"} Dec 17 09:19:45 crc kubenswrapper[4935]: I1217 09:19:45.428994 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fb28k" podStartSLOduration=1.949461077 podStartE2EDuration="3.428974324s" podCreationTimestamp="2025-12-17 09:19:42 +0000 UTC" firstStartedPulling="2025-12-17 09:19:43.148155516 +0000 UTC m=+902.807996289" lastFinishedPulling="2025-12-17 09:19:44.627668773 +0000 UTC m=+904.287509536" observedRunningTime="2025-12-17 09:19:45.423257124 +0000 UTC m=+905.083097887" watchObservedRunningTime="2025-12-17 09:19:45.428974324 +0000 UTC m=+905.088815087" Dec 17 09:19:52 crc kubenswrapper[4935]: I1217 09:19:52.642298 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:52 crc kubenswrapper[4935]: I1217 
09:19:52.643044 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:52 crc kubenswrapper[4935]: I1217 09:19:52.680891 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:53 crc kubenswrapper[4935]: I1217 09:19:53.504301 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fb28k" Dec 17 09:19:53 crc kubenswrapper[4935]: I1217 09:19:53.841724 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-k268q" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.695955 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ctbzf"] Dec 17 09:19:55 crc kubenswrapper[4935]: E1217 09:19:55.697477 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d5073f-a077-4a04-9d7a-90f3b7ca9c15" containerName="registry-server" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.697495 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d5073f-a077-4a04-9d7a-90f3b7ca9c15" containerName="registry-server" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.697616 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d5073f-a077-4a04-9d7a-90f3b7ca9c15" containerName="registry-server" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.698592 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.713963 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ctbzf"] Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.795078 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-catalog-content\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.795175 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84wtq\" (UniqueName: \"kubernetes.io/projected/02c86c70-e631-432e-8555-c769bf3e94a8-kube-api-access-84wtq\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.795216 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-utilities\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.896425 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-utilities\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.896487 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-catalog-content\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.896558 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84wtq\" (UniqueName: \"kubernetes.io/projected/02c86c70-e631-432e-8555-c769bf3e94a8-kube-api-access-84wtq\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.897240 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-utilities\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.897249 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-catalog-content\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:55 crc kubenswrapper[4935]: I1217 09:19:55.916205 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84wtq\" (UniqueName: \"kubernetes.io/projected/02c86c70-e631-432e-8555-c769bf3e94a8-kube-api-access-84wtq\") pod \"community-operators-ctbzf\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:56 crc kubenswrapper[4935]: I1217 09:19:56.019599 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:19:56 crc kubenswrapper[4935]: I1217 09:19:56.346445 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ctbzf"] Dec 17 09:19:56 crc kubenswrapper[4935]: W1217 09:19:56.355907 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02c86c70_e631_432e_8555_c769bf3e94a8.slice/crio-b87a3a979487bbdbafd5a10b10bb773ade6cc4e793fb027c20cf35909f2f9c4e WatchSource:0}: Error finding container b87a3a979487bbdbafd5a10b10bb773ade6cc4e793fb027c20cf35909f2f9c4e: Status 404 returned error can't find the container with id b87a3a979487bbdbafd5a10b10bb773ade6cc4e793fb027c20cf35909f2f9c4e Dec 17 09:19:56 crc kubenswrapper[4935]: I1217 09:19:56.484724 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctbzf" event={"ID":"02c86c70-e631-432e-8555-c769bf3e94a8","Type":"ContainerStarted","Data":"b87a3a979487bbdbafd5a10b10bb773ade6cc4e793fb027c20cf35909f2f9c4e"} Dec 17 09:19:57 crc kubenswrapper[4935]: I1217 09:19:57.494712 4935 generic.go:334] "Generic (PLEG): container finished" podID="02c86c70-e631-432e-8555-c769bf3e94a8" containerID="7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f" exitCode=0 Dec 17 09:19:57 crc kubenswrapper[4935]: I1217 09:19:57.495132 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctbzf" event={"ID":"02c86c70-e631-432e-8555-c769bf3e94a8","Type":"ContainerDied","Data":"7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f"} Dec 17 09:19:58 crc kubenswrapper[4935]: I1217 09:19:58.509665 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctbzf" 
event={"ID":"02c86c70-e631-432e-8555-c769bf3e94a8","Type":"ContainerStarted","Data":"f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7"} Dec 17 09:19:59 crc kubenswrapper[4935]: I1217 09:19:59.518321 4935 generic.go:334] "Generic (PLEG): container finished" podID="02c86c70-e631-432e-8555-c769bf3e94a8" containerID="f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7" exitCode=0 Dec 17 09:19:59 crc kubenswrapper[4935]: I1217 09:19:59.518377 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctbzf" event={"ID":"02c86c70-e631-432e-8555-c769bf3e94a8","Type":"ContainerDied","Data":"f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7"} Dec 17 09:20:00 crc kubenswrapper[4935]: I1217 09:20:00.130687 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:20:00 crc kubenswrapper[4935]: I1217 09:20:00.130768 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.142733 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2"] Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.144718 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.147659 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-7tmdv" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.156100 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2"] Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.308484 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-util\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.308550 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-bundle\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.308611 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbqx\" (UniqueName: \"kubernetes.io/projected/ee37f41b-412b-4273-959c-099aeb26681f-kube-api-access-wzbqx\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 
09:20:01.410677 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-util\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.410743 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-bundle\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.410813 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbqx\" (UniqueName: \"kubernetes.io/projected/ee37f41b-412b-4273-959c-099aeb26681f-kube-api-access-wzbqx\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.411392 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-util\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.411491 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-bundle\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.432995 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbqx\" (UniqueName: \"kubernetes.io/projected/ee37f41b-412b-4273-959c-099aeb26681f-kube-api-access-wzbqx\") pod \"66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.462307 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.554698 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctbzf" event={"ID":"02c86c70-e631-432e-8555-c769bf3e94a8","Type":"ContainerStarted","Data":"a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779"} Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 09:20:01.580830 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ctbzf" podStartSLOduration=3.248485938 podStartE2EDuration="6.580809888s" podCreationTimestamp="2025-12-17 09:19:55 +0000 UTC" firstStartedPulling="2025-12-17 09:19:57.497179558 +0000 UTC m=+917.157020341" lastFinishedPulling="2025-12-17 09:20:00.829503528 +0000 UTC m=+920.489344291" observedRunningTime="2025-12-17 09:20:01.576865542 +0000 UTC m=+921.236706305" watchObservedRunningTime="2025-12-17 09:20:01.580809888 +0000 UTC m=+921.240650651" Dec 17 09:20:01 crc kubenswrapper[4935]: I1217 
09:20:01.938432 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2"] Dec 17 09:20:02 crc kubenswrapper[4935]: E1217 09:20:02.519533 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee37f41b_412b_4273_959c_099aeb26681f.slice/crio-1bca80a6c358e1d7483670ba5e553255a741d6ce5e56ae233d0ebcc9672414c2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee37f41b_412b_4273_959c_099aeb26681f.slice/crio-conmon-1bca80a6c358e1d7483670ba5e553255a741d6ce5e56ae233d0ebcc9672414c2.scope\": RecentStats: unable to find data in memory cache]" Dec 17 09:20:02 crc kubenswrapper[4935]: I1217 09:20:02.560487 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee37f41b-412b-4273-959c-099aeb26681f" containerID="1bca80a6c358e1d7483670ba5e553255a741d6ce5e56ae233d0ebcc9672414c2" exitCode=0 Dec 17 09:20:02 crc kubenswrapper[4935]: I1217 09:20:02.561912 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" event={"ID":"ee37f41b-412b-4273-959c-099aeb26681f","Type":"ContainerDied","Data":"1bca80a6c358e1d7483670ba5e553255a741d6ce5e56ae233d0ebcc9672414c2"} Dec 17 09:20:02 crc kubenswrapper[4935]: I1217 09:20:02.561955 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" event={"ID":"ee37f41b-412b-4273-959c-099aeb26681f","Type":"ContainerStarted","Data":"d03989ca073c803d5802d7e91e4258a7251bc4b97c29ae390f71acf08082498c"} Dec 17 09:20:04 crc kubenswrapper[4935]: I1217 09:20:04.575943 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee37f41b-412b-4273-959c-099aeb26681f" 
containerID="917c85460e3046dcb3e552f615ced49d8e21b6997e935f72103223a7fee81bae" exitCode=0 Dec 17 09:20:04 crc kubenswrapper[4935]: I1217 09:20:04.576046 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" event={"ID":"ee37f41b-412b-4273-959c-099aeb26681f","Type":"ContainerDied","Data":"917c85460e3046dcb3e552f615ced49d8e21b6997e935f72103223a7fee81bae"} Dec 17 09:20:05 crc kubenswrapper[4935]: I1217 09:20:05.586083 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee37f41b-412b-4273-959c-099aeb26681f" containerID="b0594a7575f22106edceec1b1552ea8fe7de7b797f01a57fa62929dc0da4d7a2" exitCode=0 Dec 17 09:20:05 crc kubenswrapper[4935]: I1217 09:20:05.586145 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" event={"ID":"ee37f41b-412b-4273-959c-099aeb26681f","Type":"ContainerDied","Data":"b0594a7575f22106edceec1b1552ea8fe7de7b797f01a57fa62929dc0da4d7a2"} Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.019709 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.020104 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.086869 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.297569 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tc5dx"] Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.317245 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.326218 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-utilities\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.326292 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nbgj\" (UniqueName: \"kubernetes.io/projected/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-kube-api-access-5nbgj\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.326317 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-catalog-content\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.333223 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tc5dx"] Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.427640 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-utilities\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.427737 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5nbgj\" (UniqueName: \"kubernetes.io/projected/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-kube-api-access-5nbgj\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.427771 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-catalog-content\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.428265 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-utilities\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.428561 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-catalog-content\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.453601 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nbgj\" (UniqueName: \"kubernetes.io/projected/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-kube-api-access-5nbgj\") pod \"redhat-marketplace-tc5dx\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.637786 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.648021 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.897861 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:06 crc kubenswrapper[4935]: I1217 09:20:06.968358 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tc5dx"] Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.034809 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzbqx\" (UniqueName: \"kubernetes.io/projected/ee37f41b-412b-4273-959c-099aeb26681f-kube-api-access-wzbqx\") pod \"ee37f41b-412b-4273-959c-099aeb26681f\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.035493 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-bundle\") pod \"ee37f41b-412b-4273-959c-099aeb26681f\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.035545 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-util\") pod \"ee37f41b-412b-4273-959c-099aeb26681f\" (UID: \"ee37f41b-412b-4273-959c-099aeb26681f\") " Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.036068 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-bundle" (OuterVolumeSpecName: "bundle") pod "ee37f41b-412b-4273-959c-099aeb26681f" (UID: 
"ee37f41b-412b-4273-959c-099aeb26681f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.041863 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee37f41b-412b-4273-959c-099aeb26681f-kube-api-access-wzbqx" (OuterVolumeSpecName: "kube-api-access-wzbqx") pod "ee37f41b-412b-4273-959c-099aeb26681f" (UID: "ee37f41b-412b-4273-959c-099aeb26681f"). InnerVolumeSpecName "kube-api-access-wzbqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.054585 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-util" (OuterVolumeSpecName: "util") pod "ee37f41b-412b-4273-959c-099aeb26681f" (UID: "ee37f41b-412b-4273-959c-099aeb26681f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.137604 4935 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.137637 4935 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ee37f41b-412b-4273-959c-099aeb26681f-util\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.137650 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzbqx\" (UniqueName: \"kubernetes.io/projected/ee37f41b-412b-4273-959c-099aeb26681f-kube-api-access-wzbqx\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.603242 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" 
event={"ID":"ee37f41b-412b-4273-959c-099aeb26681f","Type":"ContainerDied","Data":"d03989ca073c803d5802d7e91e4258a7251bc4b97c29ae390f71acf08082498c"} Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.603308 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.603324 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d03989ca073c803d5802d7e91e4258a7251bc4b97c29ae390f71acf08082498c" Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.605657 4935 generic.go:334] "Generic (PLEG): container finished" podID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerID="f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1" exitCode=0 Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.605809 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tc5dx" event={"ID":"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b","Type":"ContainerDied","Data":"f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1"} Dec 17 09:20:07 crc kubenswrapper[4935]: I1217 09:20:07.605865 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tc5dx" event={"ID":"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b","Type":"ContainerStarted","Data":"2a230e5bee40c4e9cceec9e323dfe5cd036ec81678eb3bd9903455dd1a29dfb5"} Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.098805 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7ggmg"] Dec 17 09:20:09 crc kubenswrapper[4935]: E1217 09:20:09.099567 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee37f41b-412b-4273-959c-099aeb26681f" containerName="pull" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.099583 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ee37f41b-412b-4273-959c-099aeb26681f" containerName="pull" Dec 17 09:20:09 crc kubenswrapper[4935]: E1217 09:20:09.099602 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee37f41b-412b-4273-959c-099aeb26681f" containerName="util" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.099608 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee37f41b-412b-4273-959c-099aeb26681f" containerName="util" Dec 17 09:20:09 crc kubenswrapper[4935]: E1217 09:20:09.099617 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee37f41b-412b-4273-959c-099aeb26681f" containerName="extract" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.099625 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee37f41b-412b-4273-959c-099aeb26681f" containerName="extract" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.099775 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee37f41b-412b-4273-959c-099aeb26681f" containerName="extract" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.100670 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.112834 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ggmg"] Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.165397 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v2cn\" (UniqueName: \"kubernetes.io/projected/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-kube-api-access-7v2cn\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.165465 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-catalog-content\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.165538 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-utilities\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.266873 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v2cn\" (UniqueName: \"kubernetes.io/projected/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-kube-api-access-7v2cn\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.266932 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-catalog-content\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.266982 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-utilities\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.267389 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-catalog-content\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.267451 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-utilities\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.293932 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v2cn\" (UniqueName: \"kubernetes.io/projected/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-kube-api-access-7v2cn\") pod \"certified-operators-7ggmg\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.417138 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.627361 4935 generic.go:334] "Generic (PLEG): container finished" podID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerID="5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82" exitCode=0 Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.627420 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tc5dx" event={"ID":"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b","Type":"ContainerDied","Data":"5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82"} Dec 17 09:20:09 crc kubenswrapper[4935]: I1217 09:20:09.954463 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7ggmg"] Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.143783 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm"] Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.144601 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" Dec 17 09:20:10 crc kubenswrapper[4935]: W1217 09:20:10.147068 4935 reflector.go:561] object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-8wr5w": failed to list *v1.Secret: secrets "openstack-operator-controller-operator-dockercfg-8wr5w" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Dec 17 09:20:10 crc kubenswrapper[4935]: E1217 09:20:10.147123 4935 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"openstack-operator-controller-operator-dockercfg-8wr5w\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openstack-operator-controller-operator-dockercfg-8wr5w\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.178815 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm"] Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.285939 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjg8k\" (UniqueName: \"kubernetes.io/projected/28ce268a-b7ac-4692-8870-063d1a26b9dc-kube-api-access-mjg8k\") pod \"openstack-operator-controller-operator-7ff595d8cc-s79xm\" (UID: \"28ce268a-b7ac-4692-8870-063d1a26b9dc\") " pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.387743 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjg8k\" (UniqueName: 
\"kubernetes.io/projected/28ce268a-b7ac-4692-8870-063d1a26b9dc-kube-api-access-mjg8k\") pod \"openstack-operator-controller-operator-7ff595d8cc-s79xm\" (UID: \"28ce268a-b7ac-4692-8870-063d1a26b9dc\") " pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.414058 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjg8k\" (UniqueName: \"kubernetes.io/projected/28ce268a-b7ac-4692-8870-063d1a26b9dc-kube-api-access-mjg8k\") pod \"openstack-operator-controller-operator-7ff595d8cc-s79xm\" (UID: \"28ce268a-b7ac-4692-8870-063d1a26b9dc\") " pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.636942 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tc5dx" event={"ID":"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b","Type":"ContainerStarted","Data":"0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea"} Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.638843 4935 generic.go:334] "Generic (PLEG): container finished" podID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerID="f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3" exitCode=0 Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.638885 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggmg" event={"ID":"e4092bc5-1a28-49f8-af94-6f6714b2a2ff","Type":"ContainerDied","Data":"f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3"} Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.638911 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggmg" event={"ID":"e4092bc5-1a28-49f8-af94-6f6714b2a2ff","Type":"ContainerStarted","Data":"f276a85d6c8f28411bf2c8a0fe31d77dad854611123f76f200b2925f8004f891"} Dec 17 09:20:10 crc 
kubenswrapper[4935]: I1217 09:20:10.659309 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tc5dx" podStartSLOduration=1.938244394 podStartE2EDuration="4.659266871s" podCreationTimestamp="2025-12-17 09:20:06 +0000 UTC" firstStartedPulling="2025-12-17 09:20:07.608324485 +0000 UTC m=+927.268165248" lastFinishedPulling="2025-12-17 09:20:10.329346962 +0000 UTC m=+929.989187725" observedRunningTime="2025-12-17 09:20:10.65837987 +0000 UTC m=+930.318220633" watchObservedRunningTime="2025-12-17 09:20:10.659266871 +0000 UTC m=+930.319107634" Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.889119 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ctbzf"] Dec 17 09:20:10 crc kubenswrapper[4935]: I1217 09:20:10.889464 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ctbzf" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="registry-server" containerID="cri-o://a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779" gracePeriod=2 Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.308331 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.402426 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-utilities\") pod \"02c86c70-e631-432e-8555-c769bf3e94a8\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.402548 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84wtq\" (UniqueName: \"kubernetes.io/projected/02c86c70-e631-432e-8555-c769bf3e94a8-kube-api-access-84wtq\") pod \"02c86c70-e631-432e-8555-c769bf3e94a8\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.402599 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-catalog-content\") pod \"02c86c70-e631-432e-8555-c769bf3e94a8\" (UID: \"02c86c70-e631-432e-8555-c769bf3e94a8\") " Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.403656 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-utilities" (OuterVolumeSpecName: "utilities") pod "02c86c70-e631-432e-8555-c769bf3e94a8" (UID: "02c86c70-e631-432e-8555-c769bf3e94a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.408669 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02c86c70-e631-432e-8555-c769bf3e94a8-kube-api-access-84wtq" (OuterVolumeSpecName: "kube-api-access-84wtq") pod "02c86c70-e631-432e-8555-c769bf3e94a8" (UID: "02c86c70-e631-432e-8555-c769bf3e94a8"). InnerVolumeSpecName "kube-api-access-84wtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.447809 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02c86c70-e631-432e-8555-c769bf3e94a8" (UID: "02c86c70-e631-432e-8555-c769bf3e94a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.464098 4935 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" secret="" err="failed to sync secret cache: timed out waiting for the condition" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.464195 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.504208 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.504247 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84wtq\" (UniqueName: \"kubernetes.io/projected/02c86c70-e631-432e-8555-c769bf3e94a8-kube-api-access-84wtq\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.504262 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02c86c70-e631-432e-8555-c769bf3e94a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.648871 4935 generic.go:334] "Generic (PLEG): container finished" 
podID="02c86c70-e631-432e-8555-c769bf3e94a8" containerID="a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779" exitCode=0 Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.649071 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctbzf" event={"ID":"02c86c70-e631-432e-8555-c769bf3e94a8","Type":"ContainerDied","Data":"a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779"} Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.649595 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ctbzf" event={"ID":"02c86c70-e631-432e-8555-c769bf3e94a8","Type":"ContainerDied","Data":"b87a3a979487bbdbafd5a10b10bb773ade6cc4e793fb027c20cf35909f2f9c4e"} Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.649630 4935 scope.go:117] "RemoveContainer" containerID="a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.649181 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ctbzf" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.684983 4935 scope.go:117] "RemoveContainer" containerID="f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.687217 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ctbzf"] Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.698028 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ctbzf"] Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.715520 4935 scope.go:117] "RemoveContainer" containerID="7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.720824 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm"] Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.721505 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-8wr5w" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.734369 4935 scope.go:117] "RemoveContainer" containerID="a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779" Dec 17 09:20:11 crc kubenswrapper[4935]: E1217 09:20:11.735042 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779\": container with ID starting with a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779 not found: ID does not exist" containerID="a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.735086 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779"} err="failed to get container status \"a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779\": rpc error: code = NotFound desc = could not find container \"a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779\": container with ID starting with a747317b7b951996f6594f881b180828d281c0a0eb4cbbc84cc41cc45ee27779 not found: ID does not exist" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.735113 4935 scope.go:117] "RemoveContainer" containerID="f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7" Dec 17 09:20:11 crc kubenswrapper[4935]: E1217 09:20:11.738467 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7\": container with ID starting with f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7 not found: ID does not exist" containerID="f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.738502 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7"} err="failed to get container status \"f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7\": rpc error: code = NotFound desc = could not find container \"f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7\": container with ID starting with f734f3e155622cafe841e7c280449dc16a4cc63814ccf0260cb40acdd57632b7 not found: ID does not exist" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.738526 4935 scope.go:117] "RemoveContainer" containerID="7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f" Dec 17 09:20:11 crc kubenswrapper[4935]: E1217 09:20:11.741786 4935 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f\": container with ID starting with 7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f not found: ID does not exist" containerID="7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f" Dec 17 09:20:11 crc kubenswrapper[4935]: I1217 09:20:11.741854 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f"} err="failed to get container status \"7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f\": rpc error: code = NotFound desc = could not find container \"7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f\": container with ID starting with 7e1fc44b28a5c15c08b720c92ec82a93f3ddef0b8818f50c84a08ade0edba17f not found: ID does not exist" Dec 17 09:20:12 crc kubenswrapper[4935]: I1217 09:20:12.663457 4935 generic.go:334] "Generic (PLEG): container finished" podID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerID="3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb" exitCode=0 Dec 17 09:20:12 crc kubenswrapper[4935]: I1217 09:20:12.663629 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggmg" event={"ID":"e4092bc5-1a28-49f8-af94-6f6714b2a2ff","Type":"ContainerDied","Data":"3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb"} Dec 17 09:20:12 crc kubenswrapper[4935]: I1217 09:20:12.671841 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" event={"ID":"28ce268a-b7ac-4692-8870-063d1a26b9dc","Type":"ContainerStarted","Data":"ad5df4c77a63aa879311de5f6e481f9b07e95f582db829b4ba6c2314367227ab"} Dec 17 09:20:13 crc kubenswrapper[4935]: I1217 09:20:13.140038 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="02c86c70-e631-432e-8555-c769bf3e94a8" path="/var/lib/kubelet/pods/02c86c70-e631-432e-8555-c769bf3e94a8/volumes" Dec 17 09:20:13 crc kubenswrapper[4935]: I1217 09:20:13.690464 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggmg" event={"ID":"e4092bc5-1a28-49f8-af94-6f6714b2a2ff","Type":"ContainerStarted","Data":"475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9"} Dec 17 09:20:13 crc kubenswrapper[4935]: I1217 09:20:13.714301 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7ggmg" podStartSLOduration=2.30803347 podStartE2EDuration="4.714264766s" podCreationTimestamp="2025-12-17 09:20:09 +0000 UTC" firstStartedPulling="2025-12-17 09:20:10.64037992 +0000 UTC m=+930.300220683" lastFinishedPulling="2025-12-17 09:20:13.046611216 +0000 UTC m=+932.706451979" observedRunningTime="2025-12-17 09:20:13.712708758 +0000 UTC m=+933.372549521" watchObservedRunningTime="2025-12-17 09:20:13.714264766 +0000 UTC m=+933.374105529" Dec 17 09:20:16 crc kubenswrapper[4935]: I1217 09:20:16.648436 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:16 crc kubenswrapper[4935]: I1217 09:20:16.650929 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:16 crc kubenswrapper[4935]: I1217 09:20:16.787918 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:17 crc kubenswrapper[4935]: I1217 09:20:17.870147 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.417436 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.417768 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.475932 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.755337 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" event={"ID":"28ce268a-b7ac-4692-8870-063d1a26b9dc","Type":"ContainerStarted","Data":"21f892da24e84989441d405ea6a31c3514c9b8a0a9cdf8e78c944288b87d86e4"} Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.755867 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.790174 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" podStartSLOduration=2.4852677 podStartE2EDuration="9.790150923s" podCreationTimestamp="2025-12-17 09:20:10 +0000 UTC" firstStartedPulling="2025-12-17 09:20:11.734704679 +0000 UTC m=+931.394545432" lastFinishedPulling="2025-12-17 09:20:19.039587892 +0000 UTC m=+938.699428655" observedRunningTime="2025-12-17 09:20:19.787647162 +0000 UTC m=+939.447487925" watchObservedRunningTime="2025-12-17 09:20:19.790150923 +0000 UTC m=+939.449991686" Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.824811 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.886923 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-tc5dx"] Dec 17 09:20:19 crc kubenswrapper[4935]: I1217 09:20:19.887227 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tc5dx" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="registry-server" containerID="cri-o://0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea" gracePeriod=2 Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.332190 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.518437 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-catalog-content\") pod \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.518599 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nbgj\" (UniqueName: \"kubernetes.io/projected/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-kube-api-access-5nbgj\") pod \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.518662 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-utilities\") pod \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\" (UID: \"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b\") " Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.519601 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-utilities" (OuterVolumeSpecName: "utilities") pod "476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" (UID: 
"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.523865 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-kube-api-access-5nbgj" (OuterVolumeSpecName: "kube-api-access-5nbgj") pod "476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" (UID: "476bb2c7-f319-4a3e-9aa9-f5f8a786b13b"). InnerVolumeSpecName "kube-api-access-5nbgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.538123 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" (UID: "476bb2c7-f319-4a3e-9aa9-f5f8a786b13b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.620084 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.620197 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nbgj\" (UniqueName: \"kubernetes.io/projected/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-kube-api-access-5nbgj\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.620210 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.764111 4935 generic.go:334] "Generic (PLEG): container finished" 
podID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerID="0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea" exitCode=0 Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.764218 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tc5dx" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.764209 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tc5dx" event={"ID":"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b","Type":"ContainerDied","Data":"0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea"} Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.764300 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tc5dx" event={"ID":"476bb2c7-f319-4a3e-9aa9-f5f8a786b13b","Type":"ContainerDied","Data":"2a230e5bee40c4e9cceec9e323dfe5cd036ec81678eb3bd9903455dd1a29dfb5"} Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.764333 4935 scope.go:117] "RemoveContainer" containerID="0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.784810 4935 scope.go:117] "RemoveContainer" containerID="5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.818140 4935 scope.go:117] "RemoveContainer" containerID="f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.822371 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tc5dx"] Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.828728 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tc5dx"] Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.839108 4935 scope.go:117] "RemoveContainer" 
containerID="0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea" Dec 17 09:20:20 crc kubenswrapper[4935]: E1217 09:20:20.839886 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea\": container with ID starting with 0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea not found: ID does not exist" containerID="0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.839920 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea"} err="failed to get container status \"0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea\": rpc error: code = NotFound desc = could not find container \"0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea\": container with ID starting with 0fcd9c75714cab44771585a3931946e32f63bc151bd701b7d766d41ed99b09ea not found: ID does not exist" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.839952 4935 scope.go:117] "RemoveContainer" containerID="5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82" Dec 17 09:20:20 crc kubenswrapper[4935]: E1217 09:20:20.840429 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82\": container with ID starting with 5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82 not found: ID does not exist" containerID="5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.840458 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82"} err="failed to get container status \"5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82\": rpc error: code = NotFound desc = could not find container \"5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82\": container with ID starting with 5bd41ea241257ca78ee35b16effd50ab5fa89be24e410d599d8b8def1cb9aa82 not found: ID does not exist" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.840476 4935 scope.go:117] "RemoveContainer" containerID="f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1" Dec 17 09:20:20 crc kubenswrapper[4935]: E1217 09:20:20.840796 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1\": container with ID starting with f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1 not found: ID does not exist" containerID="f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1" Dec 17 09:20:20 crc kubenswrapper[4935]: I1217 09:20:20.840830 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1"} err="failed to get container status \"f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1\": rpc error: code = NotFound desc = could not find container \"f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1\": container with ID starting with f9f39d973fb2a898fc7be0fcb996e1e9c9bb2efac2d535af2b88b6988c42f5b1 not found: ID does not exist" Dec 17 09:20:21 crc kubenswrapper[4935]: I1217 09:20:21.132015 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" path="/var/lib/kubelet/pods/476bb2c7-f319-4a3e-9aa9-f5f8a786b13b/volumes" Dec 17 09:20:22 crc kubenswrapper[4935]: I1217 
09:20:22.285934 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7ggmg"] Dec 17 09:20:22 crc kubenswrapper[4935]: I1217 09:20:22.777193 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7ggmg" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="registry-server" containerID="cri-o://475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9" gracePeriod=2 Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.680178 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.786722 4935 generic.go:334] "Generic (PLEG): container finished" podID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerID="475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9" exitCode=0 Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.786775 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggmg" event={"ID":"e4092bc5-1a28-49f8-af94-6f6714b2a2ff","Type":"ContainerDied","Data":"475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9"} Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.786814 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7ggmg" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.786834 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7ggmg" event={"ID":"e4092bc5-1a28-49f8-af94-6f6714b2a2ff","Type":"ContainerDied","Data":"f276a85d6c8f28411bf2c8a0fe31d77dad854611123f76f200b2925f8004f891"} Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.786855 4935 scope.go:117] "RemoveContainer" containerID="475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.806838 4935 scope.go:117] "RemoveContainer" containerID="3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.836174 4935 scope.go:117] "RemoveContainer" containerID="f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.857762 4935 scope.go:117] "RemoveContainer" containerID="475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9" Dec 17 09:20:23 crc kubenswrapper[4935]: E1217 09:20:23.858533 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9\": container with ID starting with 475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9 not found: ID does not exist" containerID="475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.858569 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9"} err="failed to get container status \"475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9\": rpc error: code = NotFound desc = could not find container 
\"475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9\": container with ID starting with 475c962a7fbc151431a31822a2b605a3862edc27c5005ce15f8a0509fa1c2ee9 not found: ID does not exist" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.858595 4935 scope.go:117] "RemoveContainer" containerID="3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb" Dec 17 09:20:23 crc kubenswrapper[4935]: E1217 09:20:23.859074 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb\": container with ID starting with 3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb not found: ID does not exist" containerID="3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.859114 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb"} err="failed to get container status \"3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb\": rpc error: code = NotFound desc = could not find container \"3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb\": container with ID starting with 3a3c9488bdf63310268d40beec83a06ba6aa1576d696cba7648514a1a7485abb not found: ID does not exist" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.859134 4935 scope.go:117] "RemoveContainer" containerID="f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3" Dec 17 09:20:23 crc kubenswrapper[4935]: E1217 09:20:23.859694 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3\": container with ID starting with f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3 not found: ID does not exist" 
containerID="f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.859714 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3"} err="failed to get container status \"f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3\": rpc error: code = NotFound desc = could not find container \"f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3\": container with ID starting with f34b087e31e0f273762b7d54102afc917d2436b899532509df17c7ad0c7edfa3 not found: ID does not exist" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.865926 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-utilities\") pod \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.866067 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v2cn\" (UniqueName: \"kubernetes.io/projected/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-kube-api-access-7v2cn\") pod \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.866164 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-catalog-content\") pod \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\" (UID: \"e4092bc5-1a28-49f8-af94-6f6714b2a2ff\") " Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.868098 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-utilities" (OuterVolumeSpecName: "utilities") pod 
"e4092bc5-1a28-49f8-af94-6f6714b2a2ff" (UID: "e4092bc5-1a28-49f8-af94-6f6714b2a2ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.875260 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-kube-api-access-7v2cn" (OuterVolumeSpecName: "kube-api-access-7v2cn") pod "e4092bc5-1a28-49f8-af94-6f6714b2a2ff" (UID: "e4092bc5-1a28-49f8-af94-6f6714b2a2ff"). InnerVolumeSpecName "kube-api-access-7v2cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.932891 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4092bc5-1a28-49f8-af94-6f6714b2a2ff" (UID: "e4092bc5-1a28-49f8-af94-6f6714b2a2ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.967502 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v2cn\" (UniqueName: \"kubernetes.io/projected/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-kube-api-access-7v2cn\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.967573 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:23 crc kubenswrapper[4935]: I1217 09:20:23.967583 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4092bc5-1a28-49f8-af94-6f6714b2a2ff-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:20:24 crc kubenswrapper[4935]: I1217 09:20:24.124260 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7ggmg"] Dec 17 09:20:24 crc kubenswrapper[4935]: I1217 09:20:24.127901 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7ggmg"] Dec 17 09:20:25 crc kubenswrapper[4935]: I1217 09:20:25.134236 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" path="/var/lib/kubelet/pods/e4092bc5-1a28-49f8-af94-6f6714b2a2ff/volumes" Dec 17 09:20:30 crc kubenswrapper[4935]: I1217 09:20:30.130131 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:20:30 crc kubenswrapper[4935]: I1217 09:20:30.130554 4935 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:20:31 crc kubenswrapper[4935]: I1217 09:20:31.468360 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7ff595d8cc-s79xm" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.764899 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-vzv2z"] Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765862 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.765876 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765893 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="extract-content" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.765899 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="extract-content" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765911 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.765917 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765931 4935 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="extract-utilities" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.765937 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="extract-utilities" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765945 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="extract-utilities" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.765951 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="extract-utilities" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765964 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.765971 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765982 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="extract-content" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.765988 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="extract-content" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.765997 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="extract-utilities" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.766004 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="extract-utilities" Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.766016 4935 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="extract-content" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.766022 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="extract-content" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.766132 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4092bc5-1a28-49f8-af94-6f6714b2a2ff" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.766143 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="02c86c70-e631-432e-8555-c769bf3e94a8" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.766155 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="476bb2c7-f319-4a3e-9aa9-f5f8a786b13b" containerName="registry-server" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.766704 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.768560 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-n92w5" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.773858 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.774751 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.776910 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-kzzzg" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.790087 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-vzv2z"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.796057 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.796989 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" Dec 17 09:20:50 crc kubenswrapper[4935]: W1217 09:20:50.805254 4935 reflector.go:561] object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-82dk7": failed to list *v1.Secret: secrets "designate-operator-controller-manager-dockercfg-82dk7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object Dec 17 09:20:50 crc kubenswrapper[4935]: E1217 09:20:50.805323 4935 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"designate-operator-controller-manager-dockercfg-82dk7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"designate-operator-controller-manager-dockercfg-82dk7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.814025 4935 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.817072 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.829263 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4tj6\" (UniqueName: \"kubernetes.io/projected/83f649ce-a0cd-4405-9d8f-dee381d6f85a-kube-api-access-w4tj6\") pod \"cinder-operator-controller-manager-5f98b4754f-9dv4l\" (UID: \"83f649ce-a0cd-4405-9d8f-dee381d6f85a\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.829703 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvzjn\" (UniqueName: \"kubernetes.io/projected/ff77991e-cda3-4547-878f-9a2785b3a9ab-kube-api-access-nvzjn\") pod \"barbican-operator-controller-manager-95949466-vzv2z\" (UID: \"ff77991e-cda3-4547-878f-9a2785b3a9ab\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.844013 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.845108 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.847695 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-t498q" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.865560 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.870284 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.871195 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.875012 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.875403 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-b5r5m" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.876026 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.888511 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.889110 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jxzwn" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.902559 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.931349 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4tj6\" (UniqueName: \"kubernetes.io/projected/83f649ce-a0cd-4405-9d8f-dee381d6f85a-kube-api-access-w4tj6\") pod \"cinder-operator-controller-manager-5f98b4754f-9dv4l\" (UID: \"83f649ce-a0cd-4405-9d8f-dee381d6f85a\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.931400 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxmkx\" (UniqueName: \"kubernetes.io/projected/97f9414d-21a1-41dd-a4b0-cccffa57d46a-kube-api-access-gxmkx\") pod \"glance-operator-controller-manager-767f9d7567-94b2l\" (UID: \"97f9414d-21a1-41dd-a4b0-cccffa57d46a\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.931451 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnr42\" (UniqueName: \"kubernetes.io/projected/48d7ed73-e01d-48c1-98a4-22c4b3af76e3-kube-api-access-cnr42\") pod \"heat-operator-controller-manager-59b8dcb766-vwxjb\" (UID: 
\"48d7ed73-e01d-48c1-98a4-22c4b3af76e3\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.931500 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74dxc\" (UniqueName: \"kubernetes.io/projected/dec986d2-14bc-4419-8e9c-9a6f4b1959d2-kube-api-access-74dxc\") pod \"horizon-operator-controller-manager-6ccf486b9-6jdgb\" (UID: \"dec986d2-14bc-4419-8e9c-9a6f4b1959d2\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.931519 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlf5\" (UniqueName: \"kubernetes.io/projected/b7549908-f751-4d15-bac6-e8ebcb550a55-kube-api-access-lqlf5\") pod \"designate-operator-controller-manager-66f8b87655-2t5dm\" (UID: \"b7549908-f751-4d15-bac6-e8ebcb550a55\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.931546 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvzjn\" (UniqueName: \"kubernetes.io/projected/ff77991e-cda3-4547-878f-9a2785b3a9ab-kube-api-access-nvzjn\") pod \"barbican-operator-controller-manager-95949466-vzv2z\" (UID: \"ff77991e-cda3-4547-878f-9a2785b3a9ab\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.932079 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.932961 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.938633 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-gszjj" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.942913 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.943992 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.946683 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-bg29x" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.950609 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.959060 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr"] Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.962103 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvzjn\" (UniqueName: \"kubernetes.io/projected/ff77991e-cda3-4547-878f-9a2785b3a9ab-kube-api-access-nvzjn\") pod \"barbican-operator-controller-manager-95949466-vzv2z\" (UID: \"ff77991e-cda3-4547-878f-9a2785b3a9ab\") " pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" Dec 17 09:20:50 crc kubenswrapper[4935]: I1217 09:20:50.975173 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4tj6\" (UniqueName: 
\"kubernetes.io/projected/83f649ce-a0cd-4405-9d8f-dee381d6f85a-kube-api-access-w4tj6\") pod \"cinder-operator-controller-manager-5f98b4754f-9dv4l\" (UID: \"83f649ce-a0cd-4405-9d8f-dee381d6f85a\") " pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.001254 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.002292 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.010642 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4k7ll" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.010716 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.011850 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.018411 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.019182 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lxl5m" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.019473 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.021918 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-x66z5" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.025669 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.033047 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74dxc\" (UniqueName: \"kubernetes.io/projected/dec986d2-14bc-4419-8e9c-9a6f4b1959d2-kube-api-access-74dxc\") pod \"horizon-operator-controller-manager-6ccf486b9-6jdgb\" (UID: \"dec986d2-14bc-4419-8e9c-9a6f4b1959d2\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.033090 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqlf5\" (UniqueName: \"kubernetes.io/projected/b7549908-f751-4d15-bac6-e8ebcb550a55-kube-api-access-lqlf5\") pod \"designate-operator-controller-manager-66f8b87655-2t5dm\" (UID: \"b7549908-f751-4d15-bac6-e8ebcb550a55\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.033132 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxmkx\" (UniqueName: \"kubernetes.io/projected/97f9414d-21a1-41dd-a4b0-cccffa57d46a-kube-api-access-gxmkx\") pod \"glance-operator-controller-manager-767f9d7567-94b2l\" (UID: \"97f9414d-21a1-41dd-a4b0-cccffa57d46a\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.033172 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twhsk\" (UniqueName: \"kubernetes.io/projected/8e14339f-174c-4065-8021-a3a8e56b7e16-kube-api-access-twhsk\") pod \"ironic-operator-controller-manager-f458558d7-tq9rr\" (UID: \"8e14339f-174c-4065-8021-a3a8e56b7e16\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.033207 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnr42\" (UniqueName: \"kubernetes.io/projected/48d7ed73-e01d-48c1-98a4-22c4b3af76e3-kube-api-access-cnr42\") pod \"heat-operator-controller-manager-59b8dcb766-vwxjb\" (UID: \"48d7ed73-e01d-48c1-98a4-22c4b3af76e3\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.033229 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.033261 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9sk4\" (UniqueName: \"kubernetes.io/projected/58b1e21c-930d-4c0c-9469-3e37fd64b23d-kube-api-access-q9sk4\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.042382 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 
09:20:51.047338 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.051901 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.053035 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.055658 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnr42\" (UniqueName: \"kubernetes.io/projected/48d7ed73-e01d-48c1-98a4-22c4b3af76e3-kube-api-access-cnr42\") pod \"heat-operator-controller-manager-59b8dcb766-vwxjb\" (UID: \"48d7ed73-e01d-48c1-98a4-22c4b3af76e3\") " pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.055948 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.076932 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqlf5\" (UniqueName: \"kubernetes.io/projected/b7549908-f751-4d15-bac6-e8ebcb550a55-kube-api-access-lqlf5\") pod \"designate-operator-controller-manager-66f8b87655-2t5dm\" (UID: \"b7549908-f751-4d15-bac6-e8ebcb550a55\") " pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.093885 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74dxc\" (UniqueName: \"kubernetes.io/projected/dec986d2-14bc-4419-8e9c-9a6f4b1959d2-kube-api-access-74dxc\") pod \"horizon-operator-controller-manager-6ccf486b9-6jdgb\" (UID: 
\"dec986d2-14bc-4419-8e9c-9a6f4b1959d2\") " pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.094701 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxmkx\" (UniqueName: \"kubernetes.io/projected/97f9414d-21a1-41dd-a4b0-cccffa57d46a-kube-api-access-gxmkx\") pod \"glance-operator-controller-manager-767f9d7567-94b2l\" (UID: \"97f9414d-21a1-41dd-a4b0-cccffa57d46a\") " pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.098032 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qmfwk" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.114630 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.115461 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.164721 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twhsk\" (UniqueName: \"kubernetes.io/projected/8e14339f-174c-4065-8021-a3a8e56b7e16-kube-api-access-twhsk\") pod \"ironic-operator-controller-manager-f458558d7-tq9rr\" (UID: \"8e14339f-174c-4065-8021-a3a8e56b7e16\") " pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.164805 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8hb4\" (UniqueName: \"kubernetes.io/projected/d8de4c04-1b17-45ca-9084-d69cd737bba2-kube-api-access-t8hb4\") pod \"mariadb-operator-controller-manager-f76f4954c-9nnjm\" (UID: \"d8de4c04-1b17-45ca-9084-d69cd737bba2\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.164834 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.164867 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trnjz\" (UniqueName: \"kubernetes.io/projected/9c887d00-51f3-4980-9b38-45f0d53780b8-kube-api-access-trnjz\") pod \"keystone-operator-controller-manager-5c7cbf548f-f9sfx\" (UID: \"9c887d00-51f3-4980-9b38-45f0d53780b8\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 
09:20:51.164888 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9sk4\" (UniqueName: \"kubernetes.io/projected/58b1e21c-930d-4c0c-9469-3e37fd64b23d-kube-api-access-q9sk4\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.164934 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlcbk\" (UniqueName: \"kubernetes.io/projected/9c2a1fd2-b473-40ad-873d-d4d9b79d5808-kube-api-access-xlcbk\") pod \"manila-operator-controller-manager-5fdd9786f7-dvgdt\" (UID: \"9c2a1fd2-b473-40ad-873d-d4d9b79d5808\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.164959 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x497\" (UniqueName: \"kubernetes.io/projected/d7864302-210a-49dd-99ec-f33155990249-kube-api-access-6x497\") pod \"neutron-operator-controller-manager-7cd87b778f-zpm62\" (UID: \"d7864302-210a-49dd-99ec-f33155990249\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" Dec 17 09:20:51 crc kubenswrapper[4935]: E1217 09:20:51.165425 4935 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:51 crc kubenswrapper[4935]: E1217 09:20:51.165472 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert podName:58b1e21c-930d-4c0c-9469-3e37fd64b23d nodeName:}" failed. No retries permitted until 2025-12-17 09:20:51.665454902 +0000 UTC m=+971.325295665 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert") pod "infra-operator-controller-manager-84b495f78-4g8bl" (UID: "58b1e21c-930d-4c0c-9469-3e37fd64b23d") : secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.165859 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.173175 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.190156 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.200821 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9sk4\" (UniqueName: \"kubernetes.io/projected/58b1e21c-930d-4c0c-9469-3e37fd64b23d-kube-api-access-q9sk4\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.205656 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.206781 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twhsk\" (UniqueName: \"kubernetes.io/projected/8e14339f-174c-4065-8021-a3a8e56b7e16-kube-api-access-twhsk\") pod \"ironic-operator-controller-manager-f458558d7-tq9rr\" (UID: \"8e14339f-174c-4065-8021-a3a8e56b7e16\") " 
pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.207221 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.212823 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.213837 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mtng4" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.239791 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.247550 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.248914 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.251754 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-gp559" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.256797 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.257126 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.263571 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.264506 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.265996 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.266641 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.270097 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8hb4\" (UniqueName: \"kubernetes.io/projected/d8de4c04-1b17-45ca-9084-d69cd737bba2-kube-api-access-t8hb4\") pod \"mariadb-operator-controller-manager-f76f4954c-9nnjm\" (UID: \"d8de4c04-1b17-45ca-9084-d69cd737bba2\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.270173 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trnjz\" (UniqueName: \"kubernetes.io/projected/9c887d00-51f3-4980-9b38-45f0d53780b8-kube-api-access-trnjz\") pod \"keystone-operator-controller-manager-5c7cbf548f-f9sfx\" (UID: \"9c887d00-51f3-4980-9b38-45f0d53780b8\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.270220 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xlcbk\" (UniqueName: \"kubernetes.io/projected/9c2a1fd2-b473-40ad-873d-d4d9b79d5808-kube-api-access-xlcbk\") pod \"manila-operator-controller-manager-5fdd9786f7-dvgdt\" (UID: \"9c2a1fd2-b473-40ad-873d-d4d9b79d5808\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.270243 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x497\" (UniqueName: \"kubernetes.io/projected/d7864302-210a-49dd-99ec-f33155990249-kube-api-access-6x497\") pod \"neutron-operator-controller-manager-7cd87b778f-zpm62\" (UID: \"d7864302-210a-49dd-99ec-f33155990249\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.270267 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2kqm\" (UniqueName: \"kubernetes.io/projected/b44d3640-477b-4ab4-b514-9e8aa8f03fa4-kube-api-access-z2kqm\") pod \"nova-operator-controller-manager-5fbbf8b6cc-j59sf\" (UID: \"b44d3640-477b-4ab4-b514-9e8aa8f03fa4\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.273215 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.277332 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-g6bb8" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.277586 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.277922 4935 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-6hhb9" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.277995 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.282977 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.284079 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.285969 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hr5hp" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.290414 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.291551 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.295704 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bn7q4" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.303237 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.335422 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x497\" (UniqueName: \"kubernetes.io/projected/d7864302-210a-49dd-99ec-f33155990249-kube-api-access-6x497\") pod \"neutron-operator-controller-manager-7cd87b778f-zpm62\" (UID: \"d7864302-210a-49dd-99ec-f33155990249\") " pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.345489 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8hb4\" (UniqueName: \"kubernetes.io/projected/d8de4c04-1b17-45ca-9084-d69cd737bba2-kube-api-access-t8hb4\") pod \"mariadb-operator-controller-manager-f76f4954c-9nnjm\" (UID: \"d8de4c04-1b17-45ca-9084-d69cd737bba2\") " pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.348785 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlcbk\" (UniqueName: \"kubernetes.io/projected/9c2a1fd2-b473-40ad-873d-d4d9b79d5808-kube-api-access-xlcbk\") pod \"manila-operator-controller-manager-5fdd9786f7-dvgdt\" (UID: \"9c2a1fd2-b473-40ad-873d-d4d9b79d5808\") " pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.355175 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-trnjz\" (UniqueName: \"kubernetes.io/projected/9c887d00-51f3-4980-9b38-45f0d53780b8-kube-api-access-trnjz\") pod \"keystone-operator-controller-manager-5c7cbf548f-f9sfx\" (UID: \"9c887d00-51f3-4980-9b38-45f0d53780b8\") " pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.371521 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2kqm\" (UniqueName: \"kubernetes.io/projected/b44d3640-477b-4ab4-b514-9e8aa8f03fa4-kube-api-access-z2kqm\") pod \"nova-operator-controller-manager-5fbbf8b6cc-j59sf\" (UID: \"b44d3640-477b-4ab4-b514-9e8aa8f03fa4\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.371578 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w2mr\" (UniqueName: \"kubernetes.io/projected/361cd3f0-4302-4641-8b23-bfdb3904015f-kube-api-access-7w2mr\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.371606 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.371639 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wzcr\" (UniqueName: 
\"kubernetes.io/projected/de693205-2ea2-4b43-aefc-5d4dbc8650d9-kube-api-access-4wzcr\") pod \"ovn-operator-controller-manager-bf6d4f946-5zhxw\" (UID: \"de693205-2ea2-4b43-aefc-5d4dbc8650d9\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.371682 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk6gt\" (UniqueName: \"kubernetes.io/projected/adcbaa5e-9235-4fbd-9641-929c51d02d00-kube-api-access-dk6gt\") pod \"octavia-operator-controller-manager-68c649d9d-hfklk\" (UID: \"adcbaa5e-9235-4fbd-9641-929c51d02d00\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.371734 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw92c\" (UniqueName: \"kubernetes.io/projected/f39def3f-b302-4d51-a636-752b4d23ded0-kube-api-access-sw92c\") pod \"swift-operator-controller-manager-5c6df8f9-rk9ml\" (UID: \"f39def3f-b302-4d51-a636-752b4d23ded0\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.371755 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvvwl\" (UniqueName: \"kubernetes.io/projected/08ebe1c7-8852-4b51-8042-cd2b26a5cf50-kube-api-access-gvvwl\") pod \"placement-operator-controller-manager-8665b56d78-ndj2g\" (UID: \"08ebe1c7-8852-4b51-8042-cd2b26a5cf50\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.385703 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.396019 4935 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.397075 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.397172 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.401861 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.402818 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-724dt" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.407627 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2kqm\" (UniqueName: \"kubernetes.io/projected/b44d3640-477b-4ab4-b514-9e8aa8f03fa4-kube-api-access-z2kqm\") pod \"nova-operator-controller-manager-5fbbf8b6cc-j59sf\" (UID: \"b44d3640-477b-4ab4-b514-9e8aa8f03fa4\") " pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.408419 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.409514 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.412691 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-9jnfd" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.435376 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.441855 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.443020 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.448452 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-f8nn7" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.449117 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479575 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlb48\" (UniqueName: \"kubernetes.io/projected/bfc47d4c-a4db-4e06-ae0b-40afebb7c42e-kube-api-access-dlb48\") pod \"telemetry-operator-controller-manager-97d456b9-2kr5x\" (UID: \"bfc47d4c-a4db-4e06-ae0b-40afebb7c42e\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479628 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w2mr\" (UniqueName: 
\"kubernetes.io/projected/361cd3f0-4302-4641-8b23-bfdb3904015f-kube-api-access-7w2mr\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479655 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479680 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wzcr\" (UniqueName: \"kubernetes.io/projected/de693205-2ea2-4b43-aefc-5d4dbc8650d9-kube-api-access-4wzcr\") pod \"ovn-operator-controller-manager-bf6d4f946-5zhxw\" (UID: \"de693205-2ea2-4b43-aefc-5d4dbc8650d9\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479722 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk6gt\" (UniqueName: \"kubernetes.io/projected/adcbaa5e-9235-4fbd-9641-929c51d02d00-kube-api-access-dk6gt\") pod \"octavia-operator-controller-manager-68c649d9d-hfklk\" (UID: \"adcbaa5e-9235-4fbd-9641-929c51d02d00\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479765 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw92c\" (UniqueName: \"kubernetes.io/projected/f39def3f-b302-4d51-a636-752b4d23ded0-kube-api-access-sw92c\") pod \"swift-operator-controller-manager-5c6df8f9-rk9ml\" (UID: 
\"f39def3f-b302-4d51-a636-752b4d23ded0\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479786 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvvwl\" (UniqueName: \"kubernetes.io/projected/08ebe1c7-8852-4b51-8042-cd2b26a5cf50-kube-api-access-gvvwl\") pod \"placement-operator-controller-manager-8665b56d78-ndj2g\" (UID: \"08ebe1c7-8852-4b51-8042-cd2b26a5cf50\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.479825 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlf7\" (UniqueName: \"kubernetes.io/projected/e782899f-29fc-40c3-b7ef-41bfc66a221f-kube-api-access-hvlf7\") pod \"test-operator-controller-manager-756ccf86c7-mnxj7\" (UID: \"e782899f-29fc-40c3-b7ef-41bfc66a221f\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" Dec 17 09:20:51 crc kubenswrapper[4935]: E1217 09:20:51.480216 4935 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:51 crc kubenswrapper[4935]: E1217 09:20:51.480261 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert podName:361cd3f0-4302-4641-8b23-bfdb3904015f nodeName:}" failed. No retries permitted until 2025-12-17 09:20:51.980246752 +0000 UTC m=+971.640087505 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert") pod "openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" (UID: "361cd3f0-4302-4641-8b23-bfdb3904015f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.519111 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.527904 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.535436 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.545162 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw92c\" (UniqueName: \"kubernetes.io/projected/f39def3f-b302-4d51-a636-752b4d23ded0-kube-api-access-sw92c\") pod \"swift-operator-controller-manager-5c6df8f9-rk9ml\" (UID: \"f39def3f-b302-4d51-a636-752b4d23ded0\") " pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.559368 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w2mr\" (UniqueName: \"kubernetes.io/projected/361cd3f0-4302-4641-8b23-bfdb3904015f-kube-api-access-7w2mr\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.567475 4935 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dk6gt\" (UniqueName: \"kubernetes.io/projected/adcbaa5e-9235-4fbd-9641-929c51d02d00-kube-api-access-dk6gt\") pod \"octavia-operator-controller-manager-68c649d9d-hfklk\" (UID: \"adcbaa5e-9235-4fbd-9641-929c51d02d00\") " pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.570112 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wzcr\" (UniqueName: \"kubernetes.io/projected/de693205-2ea2-4b43-aefc-5d4dbc8650d9-kube-api-access-4wzcr\") pod \"ovn-operator-controller-manager-bf6d4f946-5zhxw\" (UID: \"de693205-2ea2-4b43-aefc-5d4dbc8650d9\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.570940 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvvwl\" (UniqueName: \"kubernetes.io/projected/08ebe1c7-8852-4b51-8042-cd2b26a5cf50-kube-api-access-gvvwl\") pod \"placement-operator-controller-manager-8665b56d78-ndj2g\" (UID: \"08ebe1c7-8852-4b51-8042-cd2b26a5cf50\") " pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.575180 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.585134 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59g5p\" (UniqueName: \"kubernetes.io/projected/3f54488f-6b5c-458d-be0c-19b9248cc7b1-kube-api-access-59g5p\") pod \"watcher-operator-controller-manager-55f78b7c4c-6cvpq\" (UID: \"3f54488f-6b5c-458d-be0c-19b9248cc7b1\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.585243 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlf7\" (UniqueName: \"kubernetes.io/projected/e782899f-29fc-40c3-b7ef-41bfc66a221f-kube-api-access-hvlf7\") pod \"test-operator-controller-manager-756ccf86c7-mnxj7\" (UID: \"e782899f-29fc-40c3-b7ef-41bfc66a221f\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.585295 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlb48\" (UniqueName: \"kubernetes.io/projected/bfc47d4c-a4db-4e06-ae0b-40afebb7c42e-kube-api-access-dlb48\") pod \"telemetry-operator-controller-manager-97d456b9-2kr5x\" (UID: \"bfc47d4c-a4db-4e06-ae0b-40afebb7c42e\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.670623 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.671625 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.688208 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.698352 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlb48\" (UniqueName: \"kubernetes.io/projected/bfc47d4c-a4db-4e06-ae0b-40afebb7c42e-kube-api-access-dlb48\") pod \"telemetry-operator-controller-manager-97d456b9-2kr5x\" (UID: \"bfc47d4c-a4db-4e06-ae0b-40afebb7c42e\") " pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.709200 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlf7\" (UniqueName: \"kubernetes.io/projected/e782899f-29fc-40c3-b7ef-41bfc66a221f-kube-api-access-hvlf7\") pod \"test-operator-controller-manager-756ccf86c7-mnxj7\" (UID: \"e782899f-29fc-40c3-b7ef-41bfc66a221f\") " pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.744875 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.757607 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.776820 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59g5p\" (UniqueName: \"kubernetes.io/projected/3f54488f-6b5c-458d-be0c-19b9248cc7b1-kube-api-access-59g5p\") pod \"watcher-operator-controller-manager-55f78b7c4c-6cvpq\" (UID: \"3f54488f-6b5c-458d-be0c-19b9248cc7b1\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.776925 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:51 crc kubenswrapper[4935]: E1217 09:20:51.784093 4935 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:51 crc kubenswrapper[4935]: E1217 09:20:51.784195 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert podName:58b1e21c-930d-4c0c-9469-3e37fd64b23d nodeName:}" failed. No retries permitted until 2025-12-17 09:20:52.784170877 +0000 UTC m=+972.444011640 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert") pod "infra-operator-controller-manager-84b495f78-4g8bl" (UID: "58b1e21c-930d-4c0c-9469-3e37fd64b23d") : secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.898082 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc"] Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.899937 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59g5p\" (UniqueName: \"kubernetes.io/projected/3f54488f-6b5c-458d-be0c-19b9248cc7b1-kube-api-access-59g5p\") pod \"watcher-operator-controller-manager-55f78b7c4c-6cvpq\" (UID: \"3f54488f-6b5c-458d-be0c-19b9248cc7b1\") " pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.909568 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.912515 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.912580 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdb9s\" (UniqueName: \"kubernetes.io/projected/cca50749-e7c9-4310-aaec-873208df4579-kube-api-access-gdb9s\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.912610 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.913729 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qzvsh" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.928250 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.928596 4935 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 17 09:20:51 crc kubenswrapper[4935]: I1217 09:20:51.965577 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc"] Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.013071 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d"] Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.014241 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.016258 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.016325 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.016366 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdb9s\" (UniqueName: \"kubernetes.io/projected/cca50749-e7c9-4310-aaec-873208df4579-kube-api-access-gdb9s\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " 
pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.016398 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.017576 4935 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.017640 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:20:52.517619813 +0000 UTC m=+972.177460576 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "metrics-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.017877 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-vztsk" Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.018022 4935 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.018051 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert podName:361cd3f0-4302-4641-8b23-bfdb3904015f nodeName:}" failed. No retries permitted until 2025-12-17 09:20:53.018042943 +0000 UTC m=+972.677883706 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert") pod "openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" (UID: "361cd3f0-4302-4641-8b23-bfdb3904015f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.018378 4935 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.028785 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:20:52.528761215 +0000 UTC m=+972.188601978 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.022223 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d"] Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.044014 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.055718 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdb9s\" (UniqueName: \"kubernetes.io/projected/cca50749-e7c9-4310-aaec-873208df4579-kube-api-access-gdb9s\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.089677 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.116971 4935 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" secret="" err="failed to sync secret cache: timed out waiting for the condition" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.117064 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.120472 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5z7kj\" (UniqueName: \"kubernetes.io/projected/80038f1d-56db-4e70-91cf-3cec348298cc-kube-api-access-5z7kj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xlq6d\" (UID: \"80038f1d-56db-4e70-91cf-3cec348298cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.211107 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-82dk7" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.221889 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5z7kj\" (UniqueName: \"kubernetes.io/projected/80038f1d-56db-4e70-91cf-3cec348298cc-kube-api-access-5z7kj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xlq6d\" (UID: \"80038f1d-56db-4e70-91cf-3cec348298cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.250003 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5z7kj\" (UniqueName: \"kubernetes.io/projected/80038f1d-56db-4e70-91cf-3cec348298cc-kube-api-access-5z7kj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xlq6d\" (UID: \"80038f1d-56db-4e70-91cf-3cec348298cc\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.521518 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.538457 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.538564 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.538757 4935 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.538831 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:20:53.538811099 +0000 UTC m=+973.198651872 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "metrics-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.539378 4935 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.539422 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:20:53.539409224 +0000 UTC m=+973.199249987 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.848258 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.848460 4935 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: E1217 09:20:52.848548 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert 
podName:58b1e21c-930d-4c0c-9469-3e37fd64b23d nodeName:}" failed. No retries permitted until 2025-12-17 09:20:54.848520295 +0000 UTC m=+974.508361058 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert") pod "infra-operator-controller-manager-84b495f78-4g8bl" (UID: "58b1e21c-930d-4c0c-9469-3e37fd64b23d") : secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:52 crc kubenswrapper[4935]: I1217 09:20:52.900865 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb"] Dec 17 09:20:53 crc kubenswrapper[4935]: I1217 09:20:53.051937 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:53 crc kubenswrapper[4935]: E1217 09:20:53.052125 4935 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:53 crc kubenswrapper[4935]: E1217 09:20:53.052201 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert podName:361cd3f0-4302-4641-8b23-bfdb3904015f nodeName:}" failed. No retries permitted until 2025-12-17 09:20:55.052178034 +0000 UTC m=+974.712018797 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert") pod "openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" (UID: "361cd3f0-4302-4641-8b23-bfdb3904015f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:53 crc kubenswrapper[4935]: I1217 09:20:53.101356 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" event={"ID":"dec986d2-14bc-4419-8e9c-9a6f4b1959d2","Type":"ContainerStarted","Data":"adc7f5f0be3f35d9eb9ce661f5aa315ef575f772fe9d19a45c31d13e545b7948"} Dec 17 09:20:53 crc kubenswrapper[4935]: I1217 09:20:53.599659 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:53 crc kubenswrapper[4935]: I1217 09:20:53.600639 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:53 crc kubenswrapper[4935]: E1217 09:20:53.600169 4935 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 17 09:20:53 crc kubenswrapper[4935]: E1217 09:20:53.600863 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" 
failed. No retries permitted until 2025-12-17 09:20:55.60084141 +0000 UTC m=+975.260682173 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "webhook-server-cert" not found Dec 17 09:20:53 crc kubenswrapper[4935]: E1217 09:20:53.600798 4935 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 17 09:20:53 crc kubenswrapper[4935]: E1217 09:20:53.601454 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:20:55.601411224 +0000 UTC m=+975.261251987 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "metrics-server-cert" not found Dec 17 09:20:53 crc kubenswrapper[4935]: I1217 09:20:53.754985 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l"] Dec 17 09:20:53 crc kubenswrapper[4935]: W1217 09:20:53.761423 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f649ce_a0cd_4405_9d8f_dee381d6f85a.slice/crio-a024ba283e8de7ccde6dfbc1db3187856a52644ddf59f7fe9ee366ed45e10b95 WatchSource:0}: Error finding container a024ba283e8de7ccde6dfbc1db3187856a52644ddf59f7fe9ee366ed45e10b95: Status 404 returned error can't find the container with id a024ba283e8de7ccde6dfbc1db3187856a52644ddf59f7fe9ee366ed45e10b95 
Dec 17 09:20:53 crc kubenswrapper[4935]: W1217 09:20:53.765142 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f9414d_21a1_41dd_a4b0_cccffa57d46a.slice/crio-8de63d2c2657e8aa1a5cf285688f756ff542718e685c3a763c057db01b0d2aa9 WatchSource:0}: Error finding container 8de63d2c2657e8aa1a5cf285688f756ff542718e685c3a763c057db01b0d2aa9: Status 404 returned error can't find the container with id 8de63d2c2657e8aa1a5cf285688f756ff542718e685c3a763c057db01b0d2aa9 Dec 17 09:20:53 crc kubenswrapper[4935]: I1217 09:20:53.765325 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l"] Dec 17 09:20:53 crc kubenswrapper[4935]: I1217 09:20:53.950423 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.020386 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.037632 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-95949466-vzv2z"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.077843 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.095858 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.118426 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.121194 4935 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" event={"ID":"83f649ce-a0cd-4405-9d8f-dee381d6f85a","Type":"ContainerStarted","Data":"a024ba283e8de7ccde6dfbc1db3187856a52644ddf59f7fe9ee366ed45e10b95"} Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.124235 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" event={"ID":"d8de4c04-1b17-45ca-9084-d69cd737bba2","Type":"ContainerStarted","Data":"7d250a524fac02e8ac6cbc48b9f0328c309367990776bd40ff57efa0e8b50d87"} Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.125689 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" event={"ID":"48d7ed73-e01d-48c1-98a4-22c4b3af76e3","Type":"ContainerStarted","Data":"f48b23dd2d3edfdcd0216907224757d56356f3768e306dd7afe19b5192bf4f9d"} Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.126802 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" event={"ID":"f39def3f-b302-4d51-a636-752b4d23ded0","Type":"ContainerStarted","Data":"96bea46879892eaba64f76f9d58d8ca9b3685338008397ce165c14d9e2f85949"} Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.128623 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" event={"ID":"8e14339f-174c-4065-8021-a3a8e56b7e16","Type":"ContainerStarted","Data":"0b842ab441e1b7aae3b78fa5a29670ec12604dd00ecf22c4dfe4ddd9c03af319"} Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.129927 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" event={"ID":"97f9414d-21a1-41dd-a4b0-cccffa57d46a","Type":"ContainerStarted","Data":"8de63d2c2657e8aa1a5cf285688f756ff542718e685c3a763c057db01b0d2aa9"} 
Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.145716 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" event={"ID":"ff77991e-cda3-4547-878f-9a2785b3a9ab","Type":"ContainerStarted","Data":"edda47dbc7818fee0827e4ca757b21a08cf01a9422fa30b019107d3fed93548c"} Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.415034 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.441408 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.447603 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.461684 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.476221 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk"] Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.481970 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dk6gt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-68c649d9d-hfklk_openstack-operators(adcbaa5e-9235-4fbd-9641-929c51d02d00): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.483131 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4wzcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bf6d4f946-5zhxw_openstack-operators(de693205-2ea2-4b43-aefc-5d4dbc8650d9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.483205 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" podUID="adcbaa5e-9235-4fbd-9641-929c51d02d00" Dec 17 09:20:54 crc 
kubenswrapper[4935]: I1217 09:20:54.484930 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62"] Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.484982 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" podUID="de693205-2ea2-4b43-aefc-5d4dbc8650d9" Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.491463 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.511678 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.531816 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d"] Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.537211 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq"] Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.544064 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gvvwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8665b56d78-ndj2g_openstack-operators(08ebe1c7-8852-4b51-8042-cd2b26a5cf50): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.545189 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" podUID="08ebe1c7-8852-4b51-8042-cd2b26a5cf50" Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.547619 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5z7kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xlq6d_openstack-operators(80038f1d-56db-4e70-91cf-3cec348298cc): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.549395 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" podUID="80038f1d-56db-4e70-91cf-3cec348298cc" Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.551346 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf"] Dec 17 09:20:54 crc kubenswrapper[4935]: W1217 09:20:54.553515 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f54488f_6b5c_458d_be0c_19b9248cc7b1.slice/crio-cbf216ff18af1609d61ebcf54d9a777a86fc5d93ed21d684ce77280c61cdcf21 WatchSource:0}: Error finding container cbf216ff18af1609d61ebcf54d9a777a86fc5d93ed21d684ce77280c61cdcf21: Status 404 returned error can't find the container with id cbf216ff18af1609d61ebcf54d9a777a86fc5d93ed21d684ce77280c61cdcf21 Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.556030 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2kqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-j59sf_openstack-operators(b44d3640-477b-4ab4-b514-9e8aa8f03fa4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.560375 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" 
podUID="b44d3640-477b-4ab4-b514-9e8aa8f03fa4" Dec 17 09:20:54 crc kubenswrapper[4935]: I1217 09:20:54.923459 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.923649 4935 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:54 crc kubenswrapper[4935]: E1217 09:20:54.923919 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert podName:58b1e21c-930d-4c0c-9469-3e37fd64b23d nodeName:}" failed. No retries permitted until 2025-12-17 09:20:58.92389982 +0000 UTC m=+978.583740583 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert") pod "infra-operator-controller-manager-84b495f78-4g8bl" (UID: "58b1e21c-930d-4c0c-9469-3e37fd64b23d") : secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.125478 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.125884 4935 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.125948 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert podName:361cd3f0-4302-4641-8b23-bfdb3904015f nodeName:}" failed. No retries permitted until 2025-12-17 09:20:59.125927479 +0000 UTC m=+978.785768242 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert") pod "openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" (UID: "361cd3f0-4302-4641-8b23-bfdb3904015f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.156187 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" event={"ID":"9c887d00-51f3-4980-9b38-45f0d53780b8","Type":"ContainerStarted","Data":"f0faf46f8c3b1b32363b954feb9b97c449495d6b67296f3341aea0123db4e44e"} Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.157987 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" event={"ID":"3f54488f-6b5c-458d-be0c-19b9248cc7b1","Type":"ContainerStarted","Data":"cbf216ff18af1609d61ebcf54d9a777a86fc5d93ed21d684ce77280c61cdcf21"} Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.161183 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" event={"ID":"adcbaa5e-9235-4fbd-9641-929c51d02d00","Type":"ContainerStarted","Data":"f74d89a1e2d3065fd9a9a79a1bc8b593a01bc155815efd2ad3f123463340ef2e"} Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.163302 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" podUID="adcbaa5e-9235-4fbd-9641-929c51d02d00" Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.164661 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" event={"ID":"80038f1d-56db-4e70-91cf-3cec348298cc","Type":"ContainerStarted","Data":"a12988345fc4d07cff9d10e62807beae03c6d0733a4fa9d3f8d52ab6081d5a29"} Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.166577 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" podUID="80038f1d-56db-4e70-91cf-3cec348298cc" Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.183764 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" event={"ID":"08ebe1c7-8852-4b51-8042-cd2b26a5cf50","Type":"ContainerStarted","Data":"a9ca4e1348e533572dc8171e16900f64b321f0a3fe5d7fd2580350594a50897e"} Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.188735 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" podUID="08ebe1c7-8852-4b51-8042-cd2b26a5cf50" Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.190818 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" event={"ID":"de693205-2ea2-4b43-aefc-5d4dbc8650d9","Type":"ContainerStarted","Data":"73eb90195c84693c84371f0ea4c42a9dc31b1cd72af0468ef90ad3431cec42f0"} Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.193193 4935 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" podUID="de693205-2ea2-4b43-aefc-5d4dbc8650d9" Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.194929 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" event={"ID":"b7549908-f751-4d15-bac6-e8ebcb550a55","Type":"ContainerStarted","Data":"3e34dca486f8a097935d804de0140fb325a8d21dbc13601a3ba977694561e4f5"} Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.206001 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" event={"ID":"b44d3640-477b-4ab4-b514-9e8aa8f03fa4","Type":"ContainerStarted","Data":"1fe0d4b785ba296e4a7102ebcf098de48661475f68a085286d51f4da7be2a8ab"} Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.209227 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" podUID="b44d3640-477b-4ab4-b514-9e8aa8f03fa4" Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.209400 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" event={"ID":"e782899f-29fc-40c3-b7ef-41bfc66a221f","Type":"ContainerStarted","Data":"0db48ca93bc357a416d3a62daea51c1955dbadc3d6f6a6896e353441e89079ae"} Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.212117 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" event={"ID":"9c2a1fd2-b473-40ad-873d-d4d9b79d5808","Type":"ContainerStarted","Data":"f44cdf2ac13d3be829960fb0e71fbcf28fdf6d16084978517a6f4cb6f39883b4"} Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.225585 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" event={"ID":"bfc47d4c-a4db-4e06-ae0b-40afebb7c42e","Type":"ContainerStarted","Data":"6c8a0e414223754c88eb547dc1ebd6317b95b88e456266f93af0b0968e393258"} Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.229574 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" event={"ID":"d7864302-210a-49dd-99ec-f33155990249","Type":"ContainerStarted","Data":"5e3bb409a41ab235d42d54581278da8f80af370f4f297721c6543270768d69dd"} Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.633970 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:55 crc kubenswrapper[4935]: I1217 09:20:55.634045 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.634293 4935 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not 
found Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.634348 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:20:59.634334903 +0000 UTC m=+979.294175656 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "metrics-server-cert" not found Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.634738 4935 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 17 09:20:55 crc kubenswrapper[4935]: E1217 09:20:55.634766 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:20:59.634758333 +0000 UTC m=+979.294599096 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "webhook-server-cert" not found Dec 17 09:20:56 crc kubenswrapper[4935]: E1217 09:20:56.242748 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" podUID="de693205-2ea2-4b43-aefc-5d4dbc8650d9" Dec 17 09:20:56 crc kubenswrapper[4935]: E1217 09:20:56.244283 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" podUID="adcbaa5e-9235-4fbd-9641-929c51d02d00" Dec 17 09:20:56 crc kubenswrapper[4935]: E1217 09:20:56.244363 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" podUID="b44d3640-477b-4ab4-b514-9e8aa8f03fa4" Dec 17 09:20:56 crc kubenswrapper[4935]: E1217 09:20:56.245441 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" podUID="08ebe1c7-8852-4b51-8042-cd2b26a5cf50" Dec 17 09:20:56 crc kubenswrapper[4935]: E1217 09:20:56.249634 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" podUID="80038f1d-56db-4e70-91cf-3cec348298cc" Dec 17 09:20:58 crc kubenswrapper[4935]: I1217 09:20:58.929075 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:20:58 crc kubenswrapper[4935]: E1217 09:20:58.929767 4935 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:58 crc kubenswrapper[4935]: E1217 09:20:58.929839 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert podName:58b1e21c-930d-4c0c-9469-3e37fd64b23d nodeName:}" failed. No retries permitted until 2025-12-17 09:21:06.929815665 +0000 UTC m=+986.589656428 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert") pod "infra-operator-controller-manager-84b495f78-4g8bl" (UID: "58b1e21c-930d-4c0c-9469-3e37fd64b23d") : secret "infra-operator-webhook-server-cert" not found Dec 17 09:20:59 crc kubenswrapper[4935]: I1217 09:20:59.132465 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:20:59 crc kubenswrapper[4935]: E1217 09:20:59.132702 4935 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:59 crc kubenswrapper[4935]: E1217 09:20:59.132764 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert podName:361cd3f0-4302-4641-8b23-bfdb3904015f nodeName:}" failed. No retries permitted until 2025-12-17 09:21:07.132750056 +0000 UTC m=+986.792590819 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert") pod "openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" (UID: "361cd3f0-4302-4641-8b23-bfdb3904015f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:20:59 crc kubenswrapper[4935]: I1217 09:20:59.639206 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:59 crc kubenswrapper[4935]: I1217 09:20:59.639289 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:20:59 crc kubenswrapper[4935]: E1217 09:20:59.639483 4935 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 17 09:20:59 crc kubenswrapper[4935]: E1217 09:20:59.639541 4935 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 17 09:20:59 crc kubenswrapper[4935]: E1217 09:20:59.639558 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:21:07.639540611 +0000 UTC m=+987.299381374 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "metrics-server-cert" not found Dec 17 09:20:59 crc kubenswrapper[4935]: E1217 09:20:59.639688 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:21:07.639650893 +0000 UTC m=+987.299491656 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "webhook-server-cert" not found Dec 17 09:21:00 crc kubenswrapper[4935]: I1217 09:21:00.131691 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:21:00 crc kubenswrapper[4935]: I1217 09:21:00.132349 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:21:00 crc kubenswrapper[4935]: I1217 09:21:00.132444 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:21:00 crc kubenswrapper[4935]: I1217 09:21:00.133652 4935 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4beda71bb40d3ba3cd8691bc4fb1531ee2f28ecc5ac8964f7b4a375581ddcde6"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:21:00 crc kubenswrapper[4935]: I1217 09:21:00.133780 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://4beda71bb40d3ba3cd8691bc4fb1531ee2f28ecc5ac8964f7b4a375581ddcde6" gracePeriod=600 Dec 17 09:21:01 crc kubenswrapper[4935]: I1217 09:21:01.331235 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="4beda71bb40d3ba3cd8691bc4fb1531ee2f28ecc5ac8964f7b4a375581ddcde6" exitCode=0 Dec 17 09:21:01 crc kubenswrapper[4935]: I1217 09:21:01.331308 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"4beda71bb40d3ba3cd8691bc4fb1531ee2f28ecc5ac8964f7b4a375581ddcde6"} Dec 17 09:21:01 crc kubenswrapper[4935]: I1217 09:21:01.331371 4935 scope.go:117] "RemoveContainer" containerID="0452e123d683b48fb5da02863983a6b9a8cd0b2246d43611082e994f7f11e20b" Dec 17 09:21:07 crc kubenswrapper[4935]: I1217 09:21:07.008941 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 
09:21:07.009142 4935 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 09:21:07.009684 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert podName:58b1e21c-930d-4c0c-9469-3e37fd64b23d nodeName:}" failed. No retries permitted until 2025-12-17 09:21:23.009660505 +0000 UTC m=+1002.669501288 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert") pod "infra-operator-controller-manager-84b495f78-4g8bl" (UID: "58b1e21c-930d-4c0c-9469-3e37fd64b23d") : secret "infra-operator-webhook-server-cert" not found Dec 17 09:21:07 crc kubenswrapper[4935]: I1217 09:21:07.213098 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 09:21:07.213423 4935 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 09:21:07.213613 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert podName:361cd3f0-4302-4641-8b23-bfdb3904015f nodeName:}" failed. No retries permitted until 2025-12-17 09:21:23.213566599 +0000 UTC m=+1002.873407552 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert") pod "openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" (UID: "361cd3f0-4302-4641-8b23-bfdb3904015f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 17 09:21:07 crc kubenswrapper[4935]: I1217 09:21:07.720855 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:07 crc kubenswrapper[4935]: I1217 09:21:07.721470 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 09:21:07.721176 4935 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 09:21:07.721665 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:21:23.721622484 +0000 UTC m=+1003.381463457 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "webhook-server-cert" not found Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 09:21:07.721719 4935 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 17 09:21:07 crc kubenswrapper[4935]: E1217 09:21:07.721820 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs podName:cca50749-e7c9-4310-aaec-873208df4579 nodeName:}" failed. No retries permitted until 2025-12-17 09:21:23.721797098 +0000 UTC m=+1003.381638241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs") pod "openstack-operator-controller-manager-5c897cfd74-2kxtc" (UID: "cca50749-e7c9-4310-aaec-873208df4579") : secret "metrics-server-cert" not found Dec 17 09:21:09 crc kubenswrapper[4935]: E1217 09:21:09.077861 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 17 09:21:09 crc kubenswrapper[4935]: E1217 09:21:09.078163 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hvlf7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-756ccf86c7-mnxj7_openstack-operators(e782899f-29fc-40c3-b7ef-41bfc66a221f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:09 crc kubenswrapper[4935]: E1217 09:21:09.079430 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" podUID="e782899f-29fc-40c3-b7ef-41bfc66a221f" Dec 17 09:21:09 crc kubenswrapper[4935]: E1217 09:21:09.448579 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" podUID="e782899f-29fc-40c3-b7ef-41bfc66a221f" Dec 17 09:21:09 crc kubenswrapper[4935]: E1217 09:21:09.981601 4935 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 17 09:21:09 crc kubenswrapper[4935]: E1217 09:21:09.981830 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nvzjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-95949466-vzv2z_openstack-operators(ff77991e-cda3-4547-878f-9a2785b3a9ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:09 crc kubenswrapper[4935]: E1217 09:21:09.983053 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" podUID="ff77991e-cda3-4547-878f-9a2785b3a9ab" Dec 17 09:21:10 crc kubenswrapper[4935]: E1217 09:21:10.455133 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" podUID="ff77991e-cda3-4547-878f-9a2785b3a9ab" Dec 17 09:21:10 crc kubenswrapper[4935]: E1217 09:21:10.759939 4935 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027" Dec 17 09:21:10 crc kubenswrapper[4935]: E1217 09:21:10.760223 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gxmkx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-767f9d7567-94b2l_openstack-operators(97f9414d-21a1-41dd-a4b0-cccffa57d46a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:10 crc kubenswrapper[4935]: E1217 09:21:10.761444 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" podUID="97f9414d-21a1-41dd-a4b0-cccffa57d46a" Dec 17 09:21:11 crc kubenswrapper[4935]: E1217 09:21:11.460952 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:5370dc4a8e776923eec00bb50cbdb2e390e9dde50be26bdc04a216bd2d6b5027\\\"\"" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" podUID="97f9414d-21a1-41dd-a4b0-cccffa57d46a" Dec 17 09:21:11 crc kubenswrapper[4935]: E1217 09:21:11.504750 4935 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87" Dec 17 09:21:11 crc kubenswrapper[4935]: E1217 09:21:11.505124 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-twhsk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-f458558d7-tq9rr_openstack-operators(8e14339f-174c-4065-8021-a3a8e56b7e16): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:11 crc kubenswrapper[4935]: E1217 09:21:11.506916 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" podUID="8e14339f-174c-4065-8021-a3a8e56b7e16" Dec 17 09:21:12 crc kubenswrapper[4935]: E1217 09:21:12.466178 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:5bdb3685be3ddc1efd62e16aaf2fa96ead64315e26d52b1b2a7d8ac01baa1e87\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" podUID="8e14339f-174c-4065-8021-a3a8e56b7e16" Dec 17 09:21:13 crc kubenswrapper[4935]: E1217 09:21:13.427365 4935 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a" Dec 17 09:21:13 crc kubenswrapper[4935]: E1217 09:21:13.427630 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xlcbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-5fdd9786f7-dvgdt_openstack-operators(9c2a1fd2-b473-40ad-873d-d4d9b79d5808): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:13 crc kubenswrapper[4935]: E1217 09:21:13.428849 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" podUID="9c2a1fd2-b473-40ad-873d-d4d9b79d5808" Dec 17 09:21:13 crc kubenswrapper[4935]: E1217 09:21:13.478085 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:44126f9c6b1d2bf752ddf989e20a4fc4cc1c07723d4fcb78465ccb2f55da6b3a\\\"\"" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" podUID="9c2a1fd2-b473-40ad-873d-d4d9b79d5808" Dec 17 09:21:14 crc kubenswrapper[4935]: E1217 09:21:14.525216 4935 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a" Dec 17 09:21:14 crc kubenswrapper[4935]: E1217 09:21:14.526040 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-w4tj6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5f98b4754f-9dv4l_openstack-operators(83f649ce-a0cd-4405-9d8f-dee381d6f85a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:14 crc kubenswrapper[4935]: E1217 09:21:14.527433 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" podUID="83f649ce-a0cd-4405-9d8f-dee381d6f85a" Dec 17 09:21:15 crc kubenswrapper[4935]: E1217 09:21:15.489101 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:5639a8e1bbc8006cf0797de49b4c063c3531972e476c2257889bb66dac7fad8a\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" podUID="83f649ce-a0cd-4405-9d8f-dee381d6f85a" Dec 17 09:21:15 crc kubenswrapper[4935]: E1217 09:21:15.750916 4935 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad" Dec 17 09:21:15 crc kubenswrapper[4935]: E1217 09:21:15.751249 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t8hb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-f76f4954c-9nnjm_openstack-operators(d8de4c04-1b17-45ca-9084-d69cd737bba2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:15 crc kubenswrapper[4935]: E1217 09:21:15.752431 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" podUID="d8de4c04-1b17-45ca-9084-d69cd737bba2" Dec 17 09:21:16 crc kubenswrapper[4935]: E1217 09:21:16.494296 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:424da951f13f1fbe9083215dc9f5088f90676dd813f01fdf3c1a8639b61cbaad\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" podUID="d8de4c04-1b17-45ca-9084-d69cd737bba2" Dec 17 09:21:16 crc kubenswrapper[4935]: E1217 09:21:16.651459 4935 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991" Dec 17 09:21:16 crc kubenswrapper[4935]: E1217 09:21:16.651752 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sw92c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5c6df8f9-rk9ml_openstack-operators(f39def3f-b302-4d51-a636-752b4d23ded0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:16 crc kubenswrapper[4935]: E1217 09:21:16.653039 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" podUID="f39def3f-b302-4d51-a636-752b4d23ded0" Dec 17 09:21:17 crc kubenswrapper[4935]: E1217 09:21:17.500217 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3aa109bb973253ae9dcf339b9b65abbd1176cdb4be672c93e538a5f113816991\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" podUID="f39def3f-b302-4d51-a636-752b4d23ded0" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.029242 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.035509 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58b1e21c-930d-4c0c-9469-3e37fd64b23d-cert\") pod \"infra-operator-controller-manager-84b495f78-4g8bl\" (UID: \"58b1e21c-930d-4c0c-9469-3e37fd64b23d\") " pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.091666 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.232839 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.237625 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/361cd3f0-4302-4641-8b23-bfdb3904015f-cert\") pod \"openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9\" (UID: \"361cd3f0-4302-4641-8b23-bfdb3904015f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.392063 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.742488 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.742576 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.747665 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-webhook-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:23 crc kubenswrapper[4935]: I1217 09:21:23.748982 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cca50749-e7c9-4310-aaec-873208df4579-metrics-certs\") pod \"openstack-operator-controller-manager-5c897cfd74-2kxtc\" (UID: \"cca50749-e7c9-4310-aaec-873208df4579\") " pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:24 crc kubenswrapper[4935]: I1217 09:21:24.004766 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.020077 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.021046 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6x497,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-7cd87b778f-zpm62_openstack-operators(d7864302-210a-49dd-99ec-f33155990249): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.022299 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" podUID="d7864302-210a-49dd-99ec-f33155990249" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.032204 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.032854 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-59g5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-55f78b7c4c-6cvpq_openstack-operators(3f54488f-6b5c-458d-be0c-19b9248cc7b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.033984 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" podUID="3f54488f-6b5c-458d-be0c-19b9248cc7b1" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.508731 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.508956 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2kqm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5fbbf8b6cc-j59sf_openstack-operators(b44d3640-477b-4ab4-b514-9e8aa8f03fa4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.510326 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" podUID="b44d3640-477b-4ab4-b514-9e8aa8f03fa4" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.565145 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:961417d59f527d925ac48ff6a11de747d0493315e496e34dc83d76a1a1fff58a\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" podUID="3f54488f-6b5c-458d-be0c-19b9248cc7b1" Dec 17 09:21:25 crc kubenswrapper[4935]: E1217 09:21:25.566054 4935 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" podUID="d7864302-210a-49dd-99ec-f33155990249" Dec 17 09:21:26 crc kubenswrapper[4935]: E1217 09:21:26.814970 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 17 09:21:26 crc kubenswrapper[4935]: E1217 09:21:26.815533 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gvvwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8665b56d78-ndj2g_openstack-operators(08ebe1c7-8852-4b51-8042-cd2b26a5cf50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:26 crc kubenswrapper[4935]: E1217 09:21:26.816769 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" podUID="08ebe1c7-8852-4b51-8042-cd2b26a5cf50" Dec 17 09:21:27 crc kubenswrapper[4935]: E1217 09:21:27.328481 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 17 09:21:27 crc kubenswrapper[4935]: E1217 09:21:27.328706 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4wzcr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bf6d4f946-5zhxw_openstack-operators(de693205-2ea2-4b43-aefc-5d4dbc8650d9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:27 crc kubenswrapper[4935]: E1217 09:21:27.330851 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" podUID="de693205-2ea2-4b43-aefc-5d4dbc8650d9" Dec 17 09:21:27 crc kubenswrapper[4935]: E1217 09:21:27.861063 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 17 09:21:27 crc kubenswrapper[4935]: E1217 09:21:27.861308 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dk6gt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-68c649d9d-hfklk_openstack-operators(adcbaa5e-9235-4fbd-9641-929c51d02d00): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:27 crc kubenswrapper[4935]: E1217 09:21:27.862663 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" podUID="adcbaa5e-9235-4fbd-9641-929c51d02d00" Dec 17 09:21:28 crc kubenswrapper[4935]: E1217 09:21:28.387743 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 17 09:21:28 crc kubenswrapper[4935]: E1217 09:21:28.388592 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5z7kj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xlq6d_openstack-operators(80038f1d-56db-4e70-91cf-3cec348298cc): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:21:28 crc kubenswrapper[4935]: E1217 09:21:28.389846 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" podUID="80038f1d-56db-4e70-91cf-3cec348298cc" Dec 17 09:21:28 crc kubenswrapper[4935]: I1217 09:21:28.922066 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl"] Dec 17 09:21:28 crc kubenswrapper[4935]: W1217 09:21:28.949856 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58b1e21c_930d_4c0c_9469_3e37fd64b23d.slice/crio-ade5b630a8f6d07ad2e18f3e09b966d8225c42e0ab353d2d5e5d2c63d16ba0b4 WatchSource:0}: Error finding container ade5b630a8f6d07ad2e18f3e09b966d8225c42e0ab353d2d5e5d2c63d16ba0b4: Status 404 returned error can't find the container with id ade5b630a8f6d07ad2e18f3e09b966d8225c42e0ab353d2d5e5d2c63d16ba0b4 Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.025529 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9"] Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.039226 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc"] Dec 17 09:21:29 crc kubenswrapper[4935]: W1217 09:21:29.064461 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod361cd3f0_4302_4641_8b23_bfdb3904015f.slice/crio-62fad404e16703d11fee3044dfbd1757f908cfb99383902a2a4a4b826348b79d WatchSource:0}: Error finding container 
62fad404e16703d11fee3044dfbd1757f908cfb99383902a2a4a4b826348b79d: Status 404 returned error can't find the container with id 62fad404e16703d11fee3044dfbd1757f908cfb99383902a2a4a4b826348b79d Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.754057 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" event={"ID":"bfc47d4c-a4db-4e06-ae0b-40afebb7c42e","Type":"ContainerStarted","Data":"a59c391487a18c2a7099ee52cf20f73285fa55a514027efac61907b488ed9d7e"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.761838 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.763519 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" event={"ID":"97f9414d-21a1-41dd-a4b0-cccffa57d46a","Type":"ContainerStarted","Data":"b8d231f74df0a3ec7c169b38c4e8ae3976022f75c695cf1a0d2243030154ae72"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.764254 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.774265 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" event={"ID":"83f649ce-a0cd-4405-9d8f-dee381d6f85a","Type":"ContainerStarted","Data":"e2269e688d54bedaf2fd608d16571a070dcb50de5722329edc7bfbc02319f99d"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.775725 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.779148 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" event={"ID":"b7549908-f751-4d15-bac6-e8ebcb550a55","Type":"ContainerStarted","Data":"4496af08de7002b6a16af532c6a4944aabd5b4de7c3215d9faf3692bfb708677"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.780455 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.783319 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" event={"ID":"48d7ed73-e01d-48c1-98a4-22c4b3af76e3","Type":"ContainerStarted","Data":"3d079cc10d354eea36d5e9f147eb795bb96bd50c46c256375a9c92fed3c7c3df"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.783352 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.804158 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" event={"ID":"e782899f-29fc-40c3-b7ef-41bfc66a221f","Type":"ContainerStarted","Data":"2f0ae75058fed14106344335de59ef6d1e435eda0937ea3f453f3f643f70491d"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.804678 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.808502 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" event={"ID":"361cd3f0-4302-4641-8b23-bfdb3904015f","Type":"ContainerStarted","Data":"62fad404e16703d11fee3044dfbd1757f908cfb99383902a2a4a4b826348b79d"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.818292 4935 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" event={"ID":"dec986d2-14bc-4419-8e9c-9a6f4b1959d2","Type":"ContainerStarted","Data":"23ca5809dbb14170901e45d6ebf99915fb6cd961f5875b9166896537bc3c2650"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.818507 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.824639 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" podStartSLOduration=5.002874326 podStartE2EDuration="39.824616301s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:53.763722004 +0000 UTC m=+973.423562767" lastFinishedPulling="2025-12-17 09:21:28.585463979 +0000 UTC m=+1008.245304742" observedRunningTime="2025-12-17 09:21:29.819490757 +0000 UTC m=+1009.479331540" watchObservedRunningTime="2025-12-17 09:21:29.824616301 +0000 UTC m=+1009.484457064" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.826893 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" podStartSLOduration=5.952909963 podStartE2EDuration="38.826885487s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.447361853 +0000 UTC m=+974.107202626" lastFinishedPulling="2025-12-17 09:21:27.321337387 +0000 UTC m=+1006.981178150" observedRunningTime="2025-12-17 09:21:29.797465699 +0000 UTC m=+1009.457306462" watchObservedRunningTime="2025-12-17 09:21:29.826885487 +0000 UTC m=+1009.486726250" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.831826 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" 
event={"ID":"cca50749-e7c9-4310-aaec-873208df4579","Type":"ContainerStarted","Data":"80864488f196c3d5c489b271bc6a735d39fd6c54449369484b2bd9357d94cc06"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.831877 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" event={"ID":"cca50749-e7c9-4310-aaec-873208df4579","Type":"ContainerStarted","Data":"36f57a45f43cac6c5a10384f4ad52049d19b9c8cdc4941e98edbe52f0fd67ab2"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.832023 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.835310 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" event={"ID":"ff77991e-cda3-4547-878f-9a2785b3a9ab","Type":"ContainerStarted","Data":"fbc7c4fde033a21c3d6b7e80c8f1e7da9883ea77ac55048c024812fbafd5cb95"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.835534 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.841659 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"b2296c1b96eef7533eacbd1e8dfbd023ac687a2c9a2c2ca41bf8bcc87a90001d"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.850495 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" event={"ID":"8e14339f-174c-4065-8021-a3a8e56b7e16","Type":"ContainerStarted","Data":"e03f4f51d44da7dfd033fd74203d4061939cf7401f0282e8bce609e1e9241608"} Dec 17 09:21:29 crc kubenswrapper[4935]: 
I1217 09:21:29.851598 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.858446 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" event={"ID":"9c2a1fd2-b473-40ad-873d-d4d9b79d5808","Type":"ContainerStarted","Data":"745bfb137840eedf262a979e725626b38558c6592dae481e711cdc67e941a34e"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.859647 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.860911 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" event={"ID":"9c887d00-51f3-4980-9b38-45f0d53780b8","Type":"ContainerStarted","Data":"e605ee8ae4b3d9472f80096271f027728d48bdc5f963ecf01d6bcf83b4163843"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.861492 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.879189 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" event={"ID":"58b1e21c-930d-4c0c-9469-3e37fd64b23d","Type":"ContainerStarted","Data":"ade5b630a8f6d07ad2e18f3e09b966d8225c42e0ab353d2d5e5d2c63d16ba0b4"} Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.900567 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" podStartSLOduration=5.153089362 podStartE2EDuration="39.900543804s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" 
firstStartedPulling="2025-12-17 09:20:53.768958882 +0000 UTC m=+973.428799655" lastFinishedPulling="2025-12-17 09:21:28.516413334 +0000 UTC m=+1008.176254097" observedRunningTime="2025-12-17 09:21:29.898075344 +0000 UTC m=+1009.557916107" watchObservedRunningTime="2025-12-17 09:21:29.900543804 +0000 UTC m=+1009.560384567" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.901378 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" podStartSLOduration=6.539927727 podStartE2EDuration="39.901370974s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:53.958429844 +0000 UTC m=+973.618270607" lastFinishedPulling="2025-12-17 09:21:27.319873051 +0000 UTC m=+1006.979713854" observedRunningTime="2025-12-17 09:21:29.871217399 +0000 UTC m=+1009.531058162" watchObservedRunningTime="2025-12-17 09:21:29.901370974 +0000 UTC m=+1009.561211737" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.918908 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" podStartSLOduration=7.055463645 podStartE2EDuration="39.918886292s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.455958223 +0000 UTC m=+974.115798986" lastFinishedPulling="2025-12-17 09:21:27.31938087 +0000 UTC m=+1006.979221633" observedRunningTime="2025-12-17 09:21:29.917852776 +0000 UTC m=+1009.577693539" watchObservedRunningTime="2025-12-17 09:21:29.918886292 +0000 UTC m=+1009.578727055" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.950357 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" podStartSLOduration=5.544496791 podStartE2EDuration="39.950338379s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" 
firstStartedPulling="2025-12-17 09:20:54.096527724 +0000 UTC m=+973.756368487" lastFinishedPulling="2025-12-17 09:21:28.502369312 +0000 UTC m=+1008.162210075" observedRunningTime="2025-12-17 09:21:29.948671008 +0000 UTC m=+1009.608511771" watchObservedRunningTime="2025-12-17 09:21:29.950338379 +0000 UTC m=+1009.610179142" Dec 17 09:21:29 crc kubenswrapper[4935]: I1217 09:21:29.974064 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" podStartSLOduration=4.91786064 podStartE2EDuration="38.974042067s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.446558824 +0000 UTC m=+974.106399587" lastFinishedPulling="2025-12-17 09:21:28.502740241 +0000 UTC m=+1008.162581014" observedRunningTime="2025-12-17 09:21:29.967250031 +0000 UTC m=+1009.627090794" watchObservedRunningTime="2025-12-17 09:21:29.974042067 +0000 UTC m=+1009.633882830" Dec 17 09:21:30 crc kubenswrapper[4935]: I1217 09:21:30.013202 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" podStartSLOduration=5.499872582 podStartE2EDuration="40.013173172s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.068031448 +0000 UTC m=+973.727872201" lastFinishedPulling="2025-12-17 09:21:28.581332028 +0000 UTC m=+1008.241172791" observedRunningTime="2025-12-17 09:21:30.009344698 +0000 UTC m=+1009.669185461" watchObservedRunningTime="2025-12-17 09:21:30.013173172 +0000 UTC m=+1009.673013935" Dec 17 09:21:30 crc kubenswrapper[4935]: I1217 09:21:30.033751 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" podStartSLOduration=7.6274054190000005 podStartE2EDuration="40.033733424s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" 
firstStartedPulling="2025-12-17 09:20:53.095838989 +0000 UTC m=+972.755679752" lastFinishedPulling="2025-12-17 09:21:25.502166994 +0000 UTC m=+1005.162007757" observedRunningTime="2025-12-17 09:21:30.028432504 +0000 UTC m=+1009.688273267" watchObservedRunningTime="2025-12-17 09:21:30.033733424 +0000 UTC m=+1009.693574187" Dec 17 09:21:30 crc kubenswrapper[4935]: I1217 09:21:30.052034 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" podStartSLOduration=5.575041127 podStartE2EDuration="40.05201024s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.117464325 +0000 UTC m=+973.777305088" lastFinishedPulling="2025-12-17 09:21:28.594433448 +0000 UTC m=+1008.254274201" observedRunningTime="2025-12-17 09:21:30.045586623 +0000 UTC m=+1009.705427386" watchObservedRunningTime="2025-12-17 09:21:30.05201024 +0000 UTC m=+1009.711851003" Dec 17 09:21:30 crc kubenswrapper[4935]: I1217 09:21:30.102813 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" podStartSLOduration=39.102788878 podStartE2EDuration="39.102788878s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:21:30.095863089 +0000 UTC m=+1009.755703862" watchObservedRunningTime="2025-12-17 09:21:30.102788878 +0000 UTC m=+1009.762629641" Dec 17 09:21:30 crc kubenswrapper[4935]: I1217 09:21:30.126066 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" podStartSLOduration=7.253969619 podStartE2EDuration="40.126044826s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.447768344 +0000 UTC 
m=+974.107609107" lastFinishedPulling="2025-12-17 09:21:27.319843541 +0000 UTC m=+1006.979684314" observedRunningTime="2025-12-17 09:21:30.123230448 +0000 UTC m=+1009.783071221" watchObservedRunningTime="2025-12-17 09:21:30.126044826 +0000 UTC m=+1009.785885589" Dec 17 09:21:30 crc kubenswrapper[4935]: I1217 09:21:30.888114 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" event={"ID":"d8de4c04-1b17-45ca-9084-d69cd737bba2","Type":"ContainerStarted","Data":"3e66d761d096dc406b46f48a1c663334234ea00254e76ccc2e96818a4a754796"} Dec 17 09:21:30 crc kubenswrapper[4935]: I1217 09:21:30.921723 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" podStartSLOduration=5.056125946 podStartE2EDuration="40.921698618s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.026324511 +0000 UTC m=+973.686165274" lastFinishedPulling="2025-12-17 09:21:29.891897193 +0000 UTC m=+1009.551737946" observedRunningTime="2025-12-17 09:21:30.918161511 +0000 UTC m=+1010.578002264" watchObservedRunningTime="2025-12-17 09:21:30.921698618 +0000 UTC m=+1010.581539391" Dec 17 09:21:31 crc kubenswrapper[4935]: I1217 09:21:31.586345 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" Dec 17 09:21:34 crc kubenswrapper[4935]: I1217 09:21:34.155719 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5c897cfd74-2kxtc" Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.044299 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" 
event={"ID":"f39def3f-b302-4d51-a636-752b4d23ded0","Type":"ContainerStarted","Data":"b36db1bf8cd6e47bb26e5ff1b67ba76f360cd635b5fbafdf90f500124b86ef25"} Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.045500 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.047117 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" event={"ID":"58b1e21c-930d-4c0c-9469-3e37fd64b23d","Type":"ContainerStarted","Data":"321dee824ad99e241f61bcec7ff88cac0e3c44c22725897a49d829e43a2e1c17"} Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.047259 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.048734 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" event={"ID":"361cd3f0-4302-4641-8b23-bfdb3904015f","Type":"ContainerStarted","Data":"e5fa6b73b9ab91d2bee3ab4ff751bd91c6d2296315158bb92cfb69758985b289"} Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.048890 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.069836 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" podStartSLOduration=3.6594302069999998 podStartE2EDuration="45.069813449s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.075754397 +0000 UTC m=+973.735595160" lastFinishedPulling="2025-12-17 09:21:35.486137639 +0000 UTC m=+1015.145978402" 
observedRunningTime="2025-12-17 09:21:36.064829917 +0000 UTC m=+1015.724670690" watchObservedRunningTime="2025-12-17 09:21:36.069813449 +0000 UTC m=+1015.729654212" Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.094633 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" podStartSLOduration=38.674234162 podStartE2EDuration="45.094608743s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:21:29.070575185 +0000 UTC m=+1008.730415948" lastFinishedPulling="2025-12-17 09:21:35.490949756 +0000 UTC m=+1015.150790529" observedRunningTime="2025-12-17 09:21:36.092563954 +0000 UTC m=+1015.752404717" watchObservedRunningTime="2025-12-17 09:21:36.094608743 +0000 UTC m=+1015.754449496" Dec 17 09:21:36 crc kubenswrapper[4935]: I1217 09:21:36.116072 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" podStartSLOduration=39.581150902 podStartE2EDuration="46.116051167s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:21:28.954758359 +0000 UTC m=+1008.614599122" lastFinishedPulling="2025-12-17 09:21:35.489658624 +0000 UTC m=+1015.149499387" observedRunningTime="2025-12-17 09:21:36.108358699 +0000 UTC m=+1015.768199482" watchObservedRunningTime="2025-12-17 09:21:36.116051167 +0000 UTC m=+1015.775891940" Dec 17 09:21:37 crc kubenswrapper[4935]: I1217 09:21:37.057894 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" event={"ID":"d7864302-210a-49dd-99ec-f33155990249","Type":"ContainerStarted","Data":"74049037b65917f614b49815b1ea9b5b4163b679d9ff6c20f6d04ab205f2ebab"} Dec 17 09:21:37 crc kubenswrapper[4935]: I1217 09:21:37.085644 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" podStartSLOduration=4.853065251 podStartE2EDuration="47.085624362s" podCreationTimestamp="2025-12-17 09:20:50 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.4816612 +0000 UTC m=+974.141501963" lastFinishedPulling="2025-12-17 09:21:36.714220311 +0000 UTC m=+1016.374061074" observedRunningTime="2025-12-17 09:21:37.079554454 +0000 UTC m=+1016.739395217" watchObservedRunningTime="2025-12-17 09:21:37.085624362 +0000 UTC m=+1016.745465115" Dec 17 09:21:37 crc kubenswrapper[4935]: E1217 09:21:37.126616 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" podUID="b44d3640-477b-4ab4-b514-9e8aa8f03fa4" Dec 17 09:21:40 crc kubenswrapper[4935]: E1217 09:21:40.126853 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" podUID="08ebe1c7-8852-4b51-8042-cd2b26a5cf50" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.089660 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" event={"ID":"3f54488f-6b5c-458d-be0c-19b9248cc7b1","Type":"ContainerStarted","Data":"aa53832f6916ec9637e849bbe211897ce1f8749c7c96e0a0ab5337cb269f6ad9"} Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.090283 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.112916 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" podStartSLOduration=3.933712169 podStartE2EDuration="50.112891689s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.578484752 +0000 UTC m=+974.238325515" lastFinishedPulling="2025-12-17 09:21:40.757664272 +0000 UTC m=+1020.417505035" observedRunningTime="2025-12-17 09:21:41.108566122 +0000 UTC m=+1020.768406915" watchObservedRunningTime="2025-12-17 09:21:41.112891689 +0000 UTC m=+1020.772732452" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.119591 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-95949466-vzv2z" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.119696 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5f98b4754f-9dv4l" Dec 17 09:21:41 crc kubenswrapper[4935]: E1217 09:21:41.125116 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" podUID="80038f1d-56db-4e70-91cf-3cec348298cc" Dec 17 09:21:41 crc kubenswrapper[4935]: E1217 09:21:41.126245 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" 
pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" podUID="de693205-2ea2-4b43-aefc-5d4dbc8650d9" Dec 17 09:21:41 crc kubenswrapper[4935]: E1217 09:21:41.126332 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" podUID="adcbaa5e-9235-4fbd-9641-929c51d02d00" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.176025 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-767f9d7567-94b2l" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.201354 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6ccf486b9-6jdgb" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.216158 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-59b8dcb766-vwxjb" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.268259 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-f458558d7-tq9rr" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.407243 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-5fdd9786f7-dvgdt" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.522922 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-f76f4954c-9nnjm" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.528840 4935 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.531239 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-7cd87b778f-zpm62" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.701587 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5c6df8f9-rk9ml" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.747777 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5c7cbf548f-f9sfx" Dec 17 09:21:41 crc kubenswrapper[4935]: I1217 09:21:41.762524 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-97d456b9-2kr5x" Dec 17 09:21:42 crc kubenswrapper[4935]: I1217 09:21:42.048043 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-756ccf86c7-mnxj7" Dec 17 09:21:42 crc kubenswrapper[4935]: I1217 09:21:42.120531 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66f8b87655-2t5dm" Dec 17 09:21:43 crc kubenswrapper[4935]: I1217 09:21:43.098796 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-84b495f78-4g8bl" Dec 17 09:21:43 crc kubenswrapper[4935]: I1217 09:21:43.401399 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9" Dec 17 09:21:49 crc kubenswrapper[4935]: I1217 09:21:49.127639 4935 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Dec 17 09:21:51 crc kubenswrapper[4935]: I1217 09:21:51.168872 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" event={"ID":"b44d3640-477b-4ab4-b514-9e8aa8f03fa4","Type":"ContainerStarted","Data":"d136997797876fa7ce32db6c94f68ead4c523adc09abcc2026c7e99b95515dbd"} Dec 17 09:21:51 crc kubenswrapper[4935]: I1217 09:21:51.169523 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" Dec 17 09:21:51 crc kubenswrapper[4935]: I1217 09:21:51.188163 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" podStartSLOduration=4.3130766640000004 podStartE2EDuration="1m0.188142538s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.5558313 +0000 UTC m=+974.215672063" lastFinishedPulling="2025-12-17 09:21:50.430897174 +0000 UTC m=+1030.090737937" observedRunningTime="2025-12-17 09:21:51.183068915 +0000 UTC m=+1030.842909678" watchObservedRunningTime="2025-12-17 09:21:51.188142538 +0000 UTC m=+1030.847983301" Dec 17 09:21:52 crc kubenswrapper[4935]: I1217 09:21:52.094566 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-55f78b7c4c-6cvpq" Dec 17 09:21:53 crc kubenswrapper[4935]: I1217 09:21:53.183683 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" event={"ID":"08ebe1c7-8852-4b51-8042-cd2b26a5cf50","Type":"ContainerStarted","Data":"92daacfeada8145a8e0c87cb59744babac464b3263ce50e25277d58d219a1c61"} Dec 17 09:21:53 crc kubenswrapper[4935]: I1217 09:21:53.184177 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" Dec 17 09:21:53 crc kubenswrapper[4935]: I1217 09:21:53.198929 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" podStartSLOduration=4.10262465 podStartE2EDuration="1m2.198906986s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.543938689 +0000 UTC m=+974.203779452" lastFinishedPulling="2025-12-17 09:21:52.640221025 +0000 UTC m=+1032.300061788" observedRunningTime="2025-12-17 09:21:53.197805269 +0000 UTC m=+1032.857646052" watchObservedRunningTime="2025-12-17 09:21:53.198906986 +0000 UTC m=+1032.858747749" Dec 17 09:21:55 crc kubenswrapper[4935]: I1217 09:21:55.211580 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" event={"ID":"adcbaa5e-9235-4fbd-9641-929c51d02d00","Type":"ContainerStarted","Data":"6b54f3b132679f5157c54b1463a2ed6a965a5c775244a39879eb57bdf8190441"} Dec 17 09:21:55 crc kubenswrapper[4935]: I1217 09:21:55.213231 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" Dec 17 09:21:55 crc kubenswrapper[4935]: I1217 09:21:55.214782 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" event={"ID":"80038f1d-56db-4e70-91cf-3cec348298cc","Type":"ContainerStarted","Data":"f6afc061b07ceeaf6e93abe6adb3aeaceeb633fae685b0e9dda866638c4ba84a"} Dec 17 09:21:55 crc kubenswrapper[4935]: I1217 09:21:55.349610 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" podStartSLOduration=4.461467715 podStartE2EDuration="1m4.349589407s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" 
firstStartedPulling="2025-12-17 09:20:54.481756162 +0000 UTC m=+974.141596915" lastFinishedPulling="2025-12-17 09:21:54.369877854 +0000 UTC m=+1034.029718607" observedRunningTime="2025-12-17 09:21:55.345842085 +0000 UTC m=+1035.005682848" watchObservedRunningTime="2025-12-17 09:21:55.349589407 +0000 UTC m=+1035.009430170" Dec 17 09:21:55 crc kubenswrapper[4935]: I1217 09:21:55.366132 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xlq6d" podStartSLOduration=4.064601782 podStartE2EDuration="1m4.36611484s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.547494506 +0000 UTC m=+974.207335269" lastFinishedPulling="2025-12-17 09:21:54.849007524 +0000 UTC m=+1034.508848327" observedRunningTime="2025-12-17 09:21:55.363035725 +0000 UTC m=+1035.022876488" watchObservedRunningTime="2025-12-17 09:21:55.36611484 +0000 UTC m=+1035.025955603" Dec 17 09:21:57 crc kubenswrapper[4935]: I1217 09:21:57.229933 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" event={"ID":"de693205-2ea2-4b43-aefc-5d4dbc8650d9","Type":"ContainerStarted","Data":"8e7f8596dd5b522c07caa23e2d1181ab14989d5b4b6fbad96aa3bce6ea8db033"} Dec 17 09:21:57 crc kubenswrapper[4935]: I1217 09:21:57.230669 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" Dec 17 09:22:01 crc kubenswrapper[4935]: I1217 09:22:01.539418 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5fbbf8b6cc-j59sf" Dec 17 09:22:01 crc kubenswrapper[4935]: I1217 09:22:01.556998 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" podStartSLOduration=8.814327462 
podStartE2EDuration="1m10.556975889s" podCreationTimestamp="2025-12-17 09:20:51 +0000 UTC" firstStartedPulling="2025-12-17 09:20:54.483041624 +0000 UTC m=+974.142882387" lastFinishedPulling="2025-12-17 09:21:56.225690061 +0000 UTC m=+1035.885530814" observedRunningTime="2025-12-17 09:21:57.257650017 +0000 UTC m=+1036.917490800" watchObservedRunningTime="2025-12-17 09:22:01.556975889 +0000 UTC m=+1041.216816652" Dec 17 09:22:01 crc kubenswrapper[4935]: I1217 09:22:01.579609 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-68c649d9d-hfklk" Dec 17 09:22:01 crc kubenswrapper[4935]: I1217 09:22:01.675883 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8665b56d78-ndj2g" Dec 17 09:22:01 crc kubenswrapper[4935]: I1217 09:22:01.678089 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-5zhxw" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.173898 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"] Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.177425 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.182893 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.183231 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.184164 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-xf88t" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.187257 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.189987 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"] Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.278306 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5f25f"] Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.279795 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.284594 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzjg5\" (UniqueName: \"kubernetes.io/projected/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-kube-api-access-mzjg5\") pod \"dnsmasq-dns-84bb9d8bd9-qc6ml\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.284685 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-config\") pod \"dnsmasq-dns-84bb9d8bd9-qc6ml\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.287056 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.289589 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5f25f"] Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.385847 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-config\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.385905 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27vqd\" (UniqueName: \"kubernetes.io/projected/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-kube-api-access-27vqd\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " 
pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.385959 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-dns-svc\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.385988 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzjg5\" (UniqueName: \"kubernetes.io/projected/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-kube-api-access-mzjg5\") pod \"dnsmasq-dns-84bb9d8bd9-qc6ml\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.386023 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-config\") pod \"dnsmasq-dns-84bb9d8bd9-qc6ml\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.387004 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-config\") pod \"dnsmasq-dns-84bb9d8bd9-qc6ml\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.410025 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzjg5\" (UniqueName: \"kubernetes.io/projected/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-kube-api-access-mzjg5\") pod \"dnsmasq-dns-84bb9d8bd9-qc6ml\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc 
kubenswrapper[4935]: I1217 09:22:22.487457 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-dns-svc\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.487607 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-config\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.487645 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27vqd\" (UniqueName: \"kubernetes.io/projected/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-kube-api-access-27vqd\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.488743 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-dns-svc\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.488764 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-config\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.500596 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.506878 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27vqd\" (UniqueName: \"kubernetes.io/projected/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-kube-api-access-27vqd\") pod \"dnsmasq-dns-5f854695bc-5f25f\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") " pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.599899 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5f25f" Dec 17 09:22:22 crc kubenswrapper[4935]: I1217 09:22:22.985111 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"] Dec 17 09:22:22 crc kubenswrapper[4935]: W1217 09:22:22.995919 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f081489_2ac9_49cb_baf0_a8bd6b6adc3e.slice/crio-9cb8d2023e76708363076e4318c53021435c699f9ff3c4808c54f56fcbc04143 WatchSource:0}: Error finding container 9cb8d2023e76708363076e4318c53021435c699f9ff3c4808c54f56fcbc04143: Status 404 returned error can't find the container with id 9cb8d2023e76708363076e4318c53021435c699f9ff3c4808c54f56fcbc04143 Dec 17 09:22:23 crc kubenswrapper[4935]: I1217 09:22:23.085014 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5f25f"] Dec 17 09:22:23 crc kubenswrapper[4935]: I1217 09:22:23.456874 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-5f25f" event={"ID":"1bf9e79a-9bd6-408c-bd89-c483ea7cae69","Type":"ContainerStarted","Data":"0e7924c91cfe9403a8482b629cc45f2522e134e449e81d8395dba5a450f1b8d8"} Dec 17 09:22:23 crc kubenswrapper[4935]: I1217 09:22:23.459208 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" 
event={"ID":"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e","Type":"ContainerStarted","Data":"9cb8d2023e76708363076e4318c53021435c699f9ff3c4808c54f56fcbc04143"} Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.586530 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5f25f"] Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.602952 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-2f4wh"] Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.605803 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.623841 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-2f4wh"] Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.717970 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.718053 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-config\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.718096 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzj5m\" (UniqueName: \"kubernetes.io/projected/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-kube-api-access-tzj5m\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " 
pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.819684 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.821849 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-config\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.821934 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzj5m\" (UniqueName: \"kubernetes.io/projected/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-kube-api-access-tzj5m\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.822691 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-config\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.829758 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.913262 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzj5m\" (UniqueName: \"kubernetes.io/projected/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-kube-api-access-tzj5m\") pod \"dnsmasq-dns-744ffd65bc-2f4wh\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.958693 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"] Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.980227 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.991476 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-v92mg"] Dec 17 09:22:25 crc kubenswrapper[4935]: I1217 09:22:25.992971 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.003519 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-v92mg"] Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.027848 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-config\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.027901 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-dns-svc\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc 
kubenswrapper[4935]: I1217 09:22:26.027925 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgrpv\" (UniqueName: \"kubernetes.io/projected/f10b4928-6f26-4d19-af69-7a924b448297-kube-api-access-sgrpv\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.130753 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-config\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.131437 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-dns-svc\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.131512 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgrpv\" (UniqueName: \"kubernetes.io/projected/f10b4928-6f26-4d19-af69-7a924b448297-kube-api-access-sgrpv\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.131970 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-config\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.132727 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-dns-svc\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.218985 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgrpv\" (UniqueName: \"kubernetes.io/projected/f10b4928-6f26-4d19-af69-7a924b448297-kube-api-access-sgrpv\") pod \"dnsmasq-dns-95f5f6995-v92mg\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.314367 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.789165 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.790711 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.794101 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.794327 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.794425 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mzrr5" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.794183 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.794230 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.794240 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.794427 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.832813 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-2f4wh"] Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.844083 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:22:26 crc kubenswrapper[4935]: W1217 09:22:26.857673 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf25c22f0_aa41_47d9_ab2b_e3cdc9394f53.slice/crio-a5d127977ad0f469f05de017b842b8727db5327761a725bab9a508705a0835ab WatchSource:0}: Error finding container a5d127977ad0f469f05de017b842b8727db5327761a725bab9a508705a0835ab: Status 404 returned error 
can't find the container with id a5d127977ad0f469f05de017b842b8727db5327761a725bab9a508705a0835ab Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.872432 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nx7l\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-kube-api-access-5nx7l\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.872497 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.872525 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.872562 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c863e0e-b041-4e68-852f-addc7126a215-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.872592 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-config-data\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") 
" pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.872696 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.872742 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.873055 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c863e0e-b041-4e68-852f-addc7126a215-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.873086 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.873119 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.873163 
4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.876147 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-v92mg"] Dec 17 09:22:26 crc kubenswrapper[4935]: W1217 09:22:26.909961 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10b4928_6f26_4d19_af69_7a924b448297.slice/crio-8904d9861040c02456e4c850abfb2cb697bfeaf0ffffa773414300afca41c651 WatchSource:0}: Error finding container 8904d9861040c02456e4c850abfb2cb697bfeaf0ffffa773414300afca41c651: Status 404 returned error can't find the container with id 8904d9861040c02456e4c850abfb2cb697bfeaf0ffffa773414300afca41c651 Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.974887 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c863e0e-b041-4e68-852f-addc7126a215-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.974930 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.974985 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975030 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975080 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nx7l\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-kube-api-access-5nx7l\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975122 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975144 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975175 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c863e0e-b041-4e68-852f-addc7126a215-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975201 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-config-data\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975231 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975253 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.975597 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.976531 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.976887 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for 
volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.977085 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-config-data\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.980702 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.980940 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-server-conf\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.981474 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.983342 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c863e0e-b041-4e68-852f-addc7126a215-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.986857 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c863e0e-b041-4e68-852f-addc7126a215-pod-info\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:26 crc kubenswrapper[4935]: I1217 09:22:26.996090 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nx7l\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-kube-api-access-5nx7l\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.000338 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.001363 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " pod="openstack/rabbitmq-server-0" Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.130854 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.146036 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.153641 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.153773 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.156705 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.157209 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.157567 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.157845 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.158422 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xcs2m"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.158513 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.158951 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.279608 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8mx\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-kube-api-access-fc8mx\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.279681 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.279801 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.279916 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.279944 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.279973 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.280028 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.280127 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.280347 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90598781-7630-4807-9735-eb1cfaba2927-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.280376 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90598781-7630-4807-9735-eb1cfaba2927-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.280397 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409194 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8mx\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-kube-api-access-fc8mx\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409320 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409414 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409489 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409540 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409585 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409621 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409677 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409758 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90598781-7630-4807-9735-eb1cfaba2927-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409778 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90598781-7630-4807-9735-eb1cfaba2927-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.409826 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.411211 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.415735 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.416980 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.417216 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.417547 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.418134 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.418249 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.421478 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90598781-7630-4807-9735-eb1cfaba2927-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.421902 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90598781-7630-4807-9735-eb1cfaba2927-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.425029 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.437102 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8mx\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-kube-api-access-fc8mx\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.458174 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.523262 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" event={"ID":"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53","Type":"ContainerStarted","Data":"a5d127977ad0f469f05de017b842b8727db5327761a725bab9a508705a0835ab"}
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.524945 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" event={"ID":"f10b4928-6f26-4d19-af69-7a924b448297","Type":"ContainerStarted","Data":"8904d9861040c02456e4c850abfb2cb697bfeaf0ffffa773414300afca41c651"}
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.529590 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.732506 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Dec 17 09:22:27 crc kubenswrapper[4935]: W1217 09:22:27.750035 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c863e0e_b041_4e68_852f_addc7126a215.slice/crio-017fef6e1dce470c680e41b49064b5b026ba6c89c257cc89fac90912886864bf WatchSource:0}: Error finding container 017fef6e1dce470c680e41b49064b5b026ba6c89c257cc89fac90912886864bf: Status 404 returned error can't find the container with id 017fef6e1dce470c680e41b49064b5b026ba6c89c257cc89fac90912886864bf
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.989764 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.992534 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.995933 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.996512 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.996692 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-cq7sn"
Dec 17 09:22:27 crc kubenswrapper[4935]: I1217 09:22:27.996871 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.009253 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.010125 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160620 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-kolla-config\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160680 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-config-data-default\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160729 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160791 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160814 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160856 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160871 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.160932 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crcmc\" (UniqueName: \"kubernetes.io/projected/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-kube-api-access-crcmc\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.201821 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263070 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-kolla-config\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263136 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-config-data-default\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263211 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263259 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263304 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263334 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263349 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.263389 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crcmc\" (UniqueName: \"kubernetes.io/projected/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-kube-api-access-crcmc\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.265298 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.278230 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-kolla-config\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.278578 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-config-data-default\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.283298 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.284019 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-operator-scripts\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.284312 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.287886 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-config-data-generated\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.291421 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crcmc\" (UniqueName: \"kubernetes.io/projected/d3d1b63f-c619-4973-bad8-c90d12ccbbe1-kube-api-access-crcmc\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.346448 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"d3d1b63f-c619-4973-bad8-c90d12ccbbe1\") " pod="openstack/openstack-galera-0"
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.556671 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c863e0e-b041-4e68-852f-addc7126a215","Type":"ContainerStarted","Data":"017fef6e1dce470c680e41b49064b5b026ba6c89c257cc89fac90912886864bf"}
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.558131 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90598781-7630-4807-9735-eb1cfaba2927","Type":"ContainerStarted","Data":"d8ee6bb81e6432a3b79cbfa5a86e4667399eb978152e3ad2b37754fdf8f9c6b8"}
Dec 17 09:22:28 crc kubenswrapper[4935]: I1217 09:22:28.621571 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.345687 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.348641 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.353436 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.353598 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.353812 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.354035 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-4pcvg"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.368227 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400618 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400688 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/594360f1-b2e1-4e64-82d5-f6e471cd6850-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400711 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400729 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlxtp\" (UniqueName: \"kubernetes.io/projected/594360f1-b2e1-4e64-82d5-f6e471cd6850-kube-api-access-nlxtp\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400785 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400815 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594360f1-b2e1-4e64-82d5-f6e471cd6850-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400846 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.400876 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/594360f1-b2e1-4e64-82d5-f6e471cd6850-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.490521 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.496648 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.498715 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504106 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlxtp\" (UniqueName: \"kubernetes.io/projected/594360f1-b2e1-4e64-82d5-f6e471cd6850-kube-api-access-nlxtp\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504178 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504226 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594360f1-b2e1-4e64-82d5-f6e471cd6850-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504259 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504302 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/594360f1-b2e1-4e64-82d5-f6e471cd6850-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504328 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504367 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/594360f1-b2e1-4e64-82d5-f6e471cd6850-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504385 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504557 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504700 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xcrfv"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.504943 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.505233 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.505352 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/594360f1-b2e1-4e64-82d5-f6e471cd6850-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.506100 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.506347 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.520941 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/594360f1-b2e1-4e64-82d5-f6e471cd6850-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.524812 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/594360f1-b2e1-4e64-82d5-f6e471cd6850-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.532565 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/594360f1-b2e1-4e64-82d5-f6e471cd6850-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.548498 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlxtp\" (UniqueName: \"kubernetes.io/projected/594360f1-b2e1-4e64-82d5-f6e471cd6850-kube-api-access-nlxtp\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.579426 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-cell1-galera-0\" (UID: \"594360f1-b2e1-4e64-82d5-f6e471cd6850\") " pod="openstack/openstack-cell1-galera-0"
Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.606782 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-config-data\") pod \"memcached-0\" (UID: 
\"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.606895 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.606926 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjd7\" (UniqueName: \"kubernetes.io/projected/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-kube-api-access-7xjd7\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.606955 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-kolla-config\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.606974 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.685640 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.714072 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-config-data\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.714180 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.714203 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjd7\" (UniqueName: \"kubernetes.io/projected/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-kube-api-access-7xjd7\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.714226 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.714242 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-kolla-config\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.716159 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-config-data\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.720166 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.721625 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-kolla-config\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.733046 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.740187 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjd7\" (UniqueName: \"kubernetes.io/projected/f4019f7f-3fa3-4d4d-976b-b81f43530f0e-kube-api-access-7xjd7\") pod \"memcached-0\" (UID: \"f4019f7f-3fa3-4d4d-976b-b81f43530f0e\") " pod="openstack/memcached-0" Dec 17 09:22:29 crc kubenswrapper[4935]: I1217 09:22:29.940899 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.653513 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.654750 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.660604 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wh2cd" Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.669013 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.773318 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdnfn\" (UniqueName: \"kubernetes.io/projected/4a4755e9-435f-4abd-b10f-1f55f1bc4d17-kube-api-access-zdnfn\") pod \"kube-state-metrics-0\" (UID: \"4a4755e9-435f-4abd-b10f-1f55f1bc4d17\") " pod="openstack/kube-state-metrics-0" Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.890781 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdnfn\" (UniqueName: \"kubernetes.io/projected/4a4755e9-435f-4abd-b10f-1f55f1bc4d17-kube-api-access-zdnfn\") pod \"kube-state-metrics-0\" (UID: \"4a4755e9-435f-4abd-b10f-1f55f1bc4d17\") " pod="openstack/kube-state-metrics-0" Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.947864 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdnfn\" (UniqueName: \"kubernetes.io/projected/4a4755e9-435f-4abd-b10f-1f55f1bc4d17-kube-api-access-zdnfn\") pod \"kube-state-metrics-0\" (UID: \"4a4755e9-435f-4abd-b10f-1f55f1bc4d17\") " pod="openstack/kube-state-metrics-0" Dec 17 09:22:31 crc kubenswrapper[4935]: I1217 09:22:31.989444 4935 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.894552 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l4zph"] Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.897631 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l4zph" Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.909068 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-6wvvv" Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.909340 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.909970 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.952055 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jhrvs"] Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.955188 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.959252 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l4zph"] Dec 17 09:22:34 crc kubenswrapper[4935]: I1217 09:22:34.979749 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jhrvs"] Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044398 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-log-ovn\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044464 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-scripts\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044488 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j4kd\" (UniqueName: \"kubernetes.io/projected/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-kube-api-access-5j4kd\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044523 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-run\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 
09:22:35.044540 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-etc-ovs\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044556 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-lib\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044578 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w4t4\" (UniqueName: \"kubernetes.io/projected/c1e01a1b-9baa-4738-8e44-b206863b4d3d-kube-api-access-7w4t4\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044604 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-run-ovn\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044697 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e01a1b-9baa-4738-8e44-b206863b4d3d-ovn-controller-tls-certs\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044724 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-run\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044785 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e01a1b-9baa-4738-8e44-b206863b4d3d-combined-ca-bundle\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044846 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-log\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.044871 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e01a1b-9baa-4738-8e44-b206863b4d3d-scripts\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146458 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e01a1b-9baa-4738-8e44-b206863b4d3d-scripts\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146570 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-log-ovn\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146633 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-scripts\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146660 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j4kd\" (UniqueName: \"kubernetes.io/projected/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-kube-api-access-5j4kd\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146690 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-run\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146706 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-etc-ovs\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146723 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-lib\") pod \"ovn-controller-ovs-jhrvs\" (UID: 
\"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146750 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w4t4\" (UniqueName: \"kubernetes.io/projected/c1e01a1b-9baa-4738-8e44-b206863b4d3d-kube-api-access-7w4t4\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146779 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-run-ovn\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146810 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e01a1b-9baa-4738-8e44-b206863b4d3d-ovn-controller-tls-certs\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146839 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-run\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.146872 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e01a1b-9baa-4738-8e44-b206863b4d3d-combined-ca-bundle\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc 
kubenswrapper[4935]: I1217 09:22:35.146930 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-log\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.147410 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-log\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.147545 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-run\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.147570 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-log-ovn\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.147601 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c1e01a1b-9baa-4738-8e44-b206863b4d3d-var-run-ovn\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.147659 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-run\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.147764 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-var-lib\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.147908 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-etc-ovs\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.149883 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-scripts\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.154708 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1e01a1b-9baa-4738-8e44-b206863b4d3d-combined-ca-bundle\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.157776 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1e01a1b-9baa-4738-8e44-b206863b4d3d-scripts\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 
17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.159556 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c1e01a1b-9baa-4738-8e44-b206863b4d3d-ovn-controller-tls-certs\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.166947 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w4t4\" (UniqueName: \"kubernetes.io/projected/c1e01a1b-9baa-4738-8e44-b206863b4d3d-kube-api-access-7w4t4\") pod \"ovn-controller-l4zph\" (UID: \"c1e01a1b-9baa-4738-8e44-b206863b4d3d\") " pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.168171 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j4kd\" (UniqueName: \"kubernetes.io/projected/dcc296df-d8ab-4200-ae5f-3ecf58c614b1-kube-api-access-5j4kd\") pod \"ovn-controller-ovs-jhrvs\" (UID: \"dcc296df-d8ab-4200-ae5f-3ecf58c614b1\") " pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.229871 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l4zph" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.273359 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.297400 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.298855 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.302406 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.302744 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.302767 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.302795 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.304463 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-h5m6f" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.317290 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.607429 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.607781 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.607802 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.607863 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5b8f\" (UniqueName: \"kubernetes.io/projected/a4f8af4b-aed3-46a8-84a0-aeae265a1309-kube-api-access-b5b8f\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.607902 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4f8af4b-aed3-46a8-84a0-aeae265a1309-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.607927 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4f8af4b-aed3-46a8-84a0-aeae265a1309-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.607946 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f8af4b-aed3-46a8-84a0-aeae265a1309-config\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.608014 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709575 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709649 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709671 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709750 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5b8f\" (UniqueName: \"kubernetes.io/projected/a4f8af4b-aed3-46a8-84a0-aeae265a1309-kube-api-access-b5b8f\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709788 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4f8af4b-aed3-46a8-84a0-aeae265a1309-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " 
pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709813 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4f8af4b-aed3-46a8-84a0-aeae265a1309-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709829 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4f8af4b-aed3-46a8-84a0-aeae265a1309-config\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.709849 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.710215 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.710856 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a4f8af4b-aed3-46a8-84a0-aeae265a1309-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.712449 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a4f8af4b-aed3-46a8-84a0-aeae265a1309-config\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.712668 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a4f8af4b-aed3-46a8-84a0-aeae265a1309-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.716076 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.716512 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.716958 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4f8af4b-aed3-46a8-84a0-aeae265a1309-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.730908 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc 
kubenswrapper[4935]: I1217 09:22:35.816006 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5b8f\" (UniqueName: \"kubernetes.io/projected/a4f8af4b-aed3-46a8-84a0-aeae265a1309-kube-api-access-b5b8f\") pod \"ovsdbserver-nb-0\" (UID: \"a4f8af4b-aed3-46a8-84a0-aeae265a1309\") " pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:35 crc kubenswrapper[4935]: I1217 09:22:35.942054 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.431090 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.436930 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.441641 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.441853 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.441944 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.442118 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-ft4rs" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.453316 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.553549 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dca41fc9-68e2-4f42-88fe-942695deca13-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.553653 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.553686 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.553718 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca41fc9-68e2-4f42-88fe-942695deca13-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.553743 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9pbx\" (UniqueName: \"kubernetes.io/projected/dca41fc9-68e2-4f42-88fe-942695deca13-kube-api-access-w9pbx\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.554299 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " 
pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.554413 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.554628 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dca41fc9-68e2-4f42-88fe-942695deca13-config\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660401 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dca41fc9-68e2-4f42-88fe-942695deca13-config\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660487 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dca41fc9-68e2-4f42-88fe-942695deca13-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660535 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660580 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660610 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca41fc9-68e2-4f42-88fe-942695deca13-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660634 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9pbx\" (UniqueName: \"kubernetes.io/projected/dca41fc9-68e2-4f42-88fe-942695deca13-kube-api-access-w9pbx\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660760 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.660795 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.661220 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dca41fc9-68e2-4f42-88fe-942695deca13-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" 
(UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.661402 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.662728 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dca41fc9-68e2-4f42-88fe-942695deca13-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.665228 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dca41fc9-68e2-4f42-88fe-942695deca13-config\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.677060 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.677708 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.677898 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dca41fc9-68e2-4f42-88fe-942695deca13-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.679664 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9pbx\" (UniqueName: \"kubernetes.io/projected/dca41fc9-68e2-4f42-88fe-942695deca13-kube-api-access-w9pbx\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.685141 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"dca41fc9-68e2-4f42-88fe-942695deca13\") " pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:39 crc kubenswrapper[4935]: I1217 09:22:39.775717 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.379888 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.380804 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-27vqd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Readines
sProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-5f25f_openstack(1bf9e79a-9bd6-408c-bd89-c483ea7cae69): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.382794 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-5f25f" podUID="1bf9e79a-9bd6-408c-bd89-c483ea7cae69" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.409048 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.409352 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgrpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-95f5f6995-v92mg_openstack(f10b4928-6f26-4d19-af69-7a924b448297): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.413990 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" podUID="f10b4928-6f26-4d19-af69-7a924b448297" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.418631 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.418848 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tzj5m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-744ffd65bc-2f4wh_openstack(f25c22f0-aa41-47d9-ab2b-e3cdc9394f53): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.420502 4935 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" podUID="f25c22f0-aa41-47d9-ab2b-e3cdc9394f53" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.485878 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.486063 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzjg5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-qc6ml_openstack(1f081489-2ac9-49cb-baf0-a8bd6b6adc3e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.487358 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" podUID="1f081489-2ac9-49cb-baf0-a8bd6b6adc3e" Dec 17 09:22:51 crc kubenswrapper[4935]: I1217 09:22:51.890556 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 17 09:22:51 crc kubenswrapper[4935]: W1217 09:22:51.896847 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4019f7f_3fa3_4d4d_976b_b81f43530f0e.slice/crio-0abd481644827deb7b476e2735c3a443e742c215c74d9ab1b8a63515658dcd7a WatchSource:0}: Error finding container 0abd481644827deb7b476e2735c3a443e742c215c74d9ab1b8a63515658dcd7a: Status 404 returned error can't find the container with id 0abd481644827deb7b476e2735c3a443e742c215c74d9ab1b8a63515658dcd7a Dec 17 09:22:51 crc kubenswrapper[4935]: I1217 09:22:51.963606 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f4019f7f-3fa3-4d4d-976b-b81f43530f0e","Type":"ContainerStarted","Data":"0abd481644827deb7b476e2735c3a443e742c215c74d9ab1b8a63515658dcd7a"} Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.971421 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" podUID="f25c22f0-aa41-47d9-ab2b-e3cdc9394f53" Dec 17 09:22:51 crc kubenswrapper[4935]: E1217 09:22:51.971813 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" podUID="f10b4928-6f26-4d19-af69-7a924b448297" Dec 17 09:22:52 crc 
kubenswrapper[4935]: I1217 09:22:52.029445 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.111469 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l4zph"]
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.172637 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jhrvs"]
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.202456 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.214196 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Dec 17 09:22:52 crc kubenswrapper[4935]: W1217 09:22:52.254551 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1e01a1b_9baa_4738_8e44_b206863b4d3d.slice/crio-683cea6dbac3c4542ccaa018d544e4503d89d92644a54f04ea0c49345496e558 WatchSource:0}: Error finding container 683cea6dbac3c4542ccaa018d544e4503d89d92644a54f04ea0c49345496e558: Status 404 returned error can't find the container with id 683cea6dbac3c4542ccaa018d544e4503d89d92644a54f04ea0c49345496e558
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.326291 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Dec 17 09:22:52 crc kubenswrapper[4935]: W1217 09:22:52.360862 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d1b63f_c619_4973_bad8_c90d12ccbbe1.slice/crio-febabaacc2f3f8d58182d0800c1cb0af040ba9fceb6c5390d5f2b3efb3988dc7 WatchSource:0}: Error finding container febabaacc2f3f8d58182d0800c1cb0af040ba9fceb6c5390d5f2b3efb3988dc7: Status 404 returned error can't find the container with id febabaacc2f3f8d58182d0800c1cb0af040ba9fceb6c5390d5f2b3efb3988dc7
Dec 17 09:22:52 crc kubenswrapper[4935]: W1217 09:22:52.454062 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod594360f1_b2e1_4e64_82d5_f6e471cd6850.slice/crio-450e02498f0377b96ab94dadfc19bbe7f9d78998d50219d62ef9de96076765a3 WatchSource:0}: Error finding container 450e02498f0377b96ab94dadfc19bbe7f9d78998d50219d62ef9de96076765a3: Status 404 returned error can't find the container with id 450e02498f0377b96ab94dadfc19bbe7f9d78998d50219d62ef9de96076765a3
Dec 17 09:22:52 crc kubenswrapper[4935]: W1217 09:22:52.554683 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddca41fc9_68e2_4f42_88fe_942695deca13.slice/crio-a722e297fadb3b9e50950976a0fa5d006353dd617b04736def88abf4f05e414b WatchSource:0}: Error finding container a722e297fadb3b9e50950976a0fa5d006353dd617b04736def88abf4f05e414b: Status 404 returned error can't find the container with id a722e297fadb3b9e50950976a0fa5d006353dd617b04736def88abf4f05e414b
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.848259 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.856113 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5f25f"
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.971031 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.971041 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-qc6ml" event={"ID":"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e","Type":"ContainerDied","Data":"9cb8d2023e76708363076e4318c53021435c699f9ff3c4808c54f56fcbc04143"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.973229 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a4755e9-435f-4abd-b10f-1f55f1bc4d17","Type":"ContainerStarted","Data":"c5679fd22c04640ffade78a3d738cbfa0acb864d4cc849a2f208da1233ba2e7c"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.975823 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l4zph" event={"ID":"c1e01a1b-9baa-4738-8e44-b206863b4d3d","Type":"ContainerStarted","Data":"683cea6dbac3c4542ccaa018d544e4503d89d92644a54f04ea0c49345496e558"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.977494 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dca41fc9-68e2-4f42-88fe-942695deca13","Type":"ContainerStarted","Data":"a722e297fadb3b9e50950976a0fa5d006353dd617b04736def88abf4f05e414b"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.979334 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c863e0e-b041-4e68-852f-addc7126a215","Type":"ContainerStarted","Data":"ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.982029 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"594360f1-b2e1-4e64-82d5-f6e471cd6850","Type":"ContainerStarted","Data":"450e02498f0377b96ab94dadfc19bbe7f9d78998d50219d62ef9de96076765a3"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.984456 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d1b63f-c619-4973-bad8-c90d12ccbbe1","Type":"ContainerStarted","Data":"febabaacc2f3f8d58182d0800c1cb0af040ba9fceb6c5390d5f2b3efb3988dc7"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.985492 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jhrvs" event={"ID":"dcc296df-d8ab-4200-ae5f-3ecf58c614b1","Type":"ContainerStarted","Data":"c571d6b49d7cb046f87905c7e26a0a577ba540d5e7fd47d63d746971a718f45e"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.986604 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-5f25f" event={"ID":"1bf9e79a-9bd6-408c-bd89-c483ea7cae69","Type":"ContainerDied","Data":"0e7924c91cfe9403a8482b629cc45f2522e134e449e81d8395dba5a450f1b8d8"}
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.986638 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-5f25f"
Dec 17 09:22:52 crc kubenswrapper[4935]: I1217 09:22:52.988869 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90598781-7630-4807-9735-eb1cfaba2927","Type":"ContainerStarted","Data":"60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c"}
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.040027 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-config\") pod \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") "
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.040865 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-dns-svc\") pod \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") "
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.041070 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzjg5\" (UniqueName: \"kubernetes.io/projected/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-kube-api-access-mzjg5\") pod \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\" (UID: \"1f081489-2ac9-49cb-baf0-a8bd6b6adc3e\") "
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.041119 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27vqd\" (UniqueName: \"kubernetes.io/projected/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-kube-api-access-27vqd\") pod \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") "
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.041592 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-config\") pod \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\" (UID: \"1bf9e79a-9bd6-408c-bd89-c483ea7cae69\") "
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.041756 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1bf9e79a-9bd6-408c-bd89-c483ea7cae69" (UID: "1bf9e79a-9bd6-408c-bd89-c483ea7cae69"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.041806 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-config" (OuterVolumeSpecName: "config") pod "1f081489-2ac9-49cb-baf0-a8bd6b6adc3e" (UID: "1f081489-2ac9-49cb-baf0-a8bd6b6adc3e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.042201 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.042220 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-config\") on node \"crc\" DevicePath \"\""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.042376 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-config" (OuterVolumeSpecName: "config") pod "1bf9e79a-9bd6-408c-bd89-c483ea7cae69" (UID: "1bf9e79a-9bd6-408c-bd89-c483ea7cae69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.054014 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-kube-api-access-mzjg5" (OuterVolumeSpecName: "kube-api-access-mzjg5") pod "1f081489-2ac9-49cb-baf0-a8bd6b6adc3e" (UID: "1f081489-2ac9-49cb-baf0-a8bd6b6adc3e"). InnerVolumeSpecName "kube-api-access-mzjg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.067628 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-kube-api-access-27vqd" (OuterVolumeSpecName: "kube-api-access-27vqd") pod "1bf9e79a-9bd6-408c-bd89-c483ea7cae69" (UID: "1bf9e79a-9bd6-408c-bd89-c483ea7cae69"). InnerVolumeSpecName "kube-api-access-27vqd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.081740 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.144651 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzjg5\" (UniqueName: \"kubernetes.io/projected/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e-kube-api-access-mzjg5\") on node \"crc\" DevicePath \"\""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.144695 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27vqd\" (UniqueName: \"kubernetes.io/projected/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-kube-api-access-27vqd\") on node \"crc\" DevicePath \"\""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.144711 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf9e79a-9bd6-408c-bd89-c483ea7cae69-config\") on node \"crc\" DevicePath \"\""
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.331716 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"]
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.344246 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-qc6ml"]
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.357490 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5f25f"]
Dec 17 09:22:53 crc kubenswrapper[4935]: I1217 09:22:53.363860 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-5f25f"]
Dec 17 09:22:54 crc kubenswrapper[4935]: I1217 09:22:54.003386 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a4f8af4b-aed3-46a8-84a0-aeae265a1309","Type":"ContainerStarted","Data":"e51397c52de851339dcaaa6222a9af1c1648a3688913f3c51fe6ab227980e50f"}
Dec 17 09:22:55 crc kubenswrapper[4935]: I1217 09:22:55.146838 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf9e79a-9bd6-408c-bd89-c483ea7cae69" path="/var/lib/kubelet/pods/1bf9e79a-9bd6-408c-bd89-c483ea7cae69/volumes"
Dec 17 09:22:55 crc kubenswrapper[4935]: I1217 09:22:55.147798 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f081489-2ac9-49cb-baf0-a8bd6b6adc3e" path="/var/lib/kubelet/pods/1f081489-2ac9-49cb-baf0-a8bd6b6adc3e/volumes"
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.050171 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dca41fc9-68e2-4f42-88fe-942695deca13","Type":"ContainerStarted","Data":"2a00e3a12e1ffb7635d86ed737af672cb8dacb50b178bb5249db439dc45cff3f"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.053067 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f4019f7f-3fa3-4d4d-976b-b81f43530f0e","Type":"ContainerStarted","Data":"ea49a467fe8524c4d373f72d4301f2d37a0dc24fe29cc8253ab221a79771b70c"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.053227 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.054952 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a4f8af4b-aed3-46a8-84a0-aeae265a1309","Type":"ContainerStarted","Data":"7a449ecda48f40e68a5c361619f3d192c85e1f233d75bfbc54ee1177193669bb"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.056932 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"594360f1-b2e1-4e64-82d5-f6e471cd6850","Type":"ContainerStarted","Data":"3b263b20c3bedbf6e502357706ad7c16c322eaa85c6721ce7d4944014b304731"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.058798 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a4755e9-435f-4abd-b10f-1f55f1bc4d17","Type":"ContainerStarted","Data":"b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.058928 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.060510 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d1b63f-c619-4973-bad8-c90d12ccbbe1","Type":"ContainerStarted","Data":"45ae7d100e9b5648cc1566801d7d45b7ed34674fb827f3e970a53c7671898916"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.062259 4935 generic.go:334] "Generic (PLEG): container finished" podID="dcc296df-d8ab-4200-ae5f-3ecf58c614b1" containerID="e769342ff091512ae7bbe5456906634926044ecb647792f8fd719201ef1cb88e" exitCode=0
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.062374 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jhrvs" event={"ID":"dcc296df-d8ab-4200-ae5f-3ecf58c614b1","Type":"ContainerDied","Data":"e769342ff091512ae7bbe5456906634926044ecb647792f8fd719201ef1cb88e"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.065581 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l4zph" event={"ID":"c1e01a1b-9baa-4738-8e44-b206863b4d3d","Type":"ContainerStarted","Data":"6a479b05d340357991ea1517238bbbd4e7ddbcb5661acae84d5601381575c491"}
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.065976 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-l4zph"
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.074558 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=23.855499856 podStartE2EDuration="31.074532375s" podCreationTimestamp="2025-12-17 09:22:29 +0000 UTC" firstStartedPulling="2025-12-17 09:22:51.903960258 +0000 UTC m=+1091.563801041" lastFinishedPulling="2025-12-17 09:22:59.122992787 +0000 UTC m=+1098.782833560" observedRunningTime="2025-12-17 09:23:00.072756092 +0000 UTC m=+1099.732596855" watchObservedRunningTime="2025-12-17 09:23:00.074532375 +0000 UTC m=+1099.734373148"
Dec 17 09:23:00 crc kubenswrapper[4935]: I1217 09:23:00.106053 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-l4zph" podStartSLOduration=19.291044223 podStartE2EDuration="26.106033265s" podCreationTimestamp="2025-12-17 09:22:34 +0000 UTC" firstStartedPulling="2025-12-17 09:22:52.257357438 +0000 UTC m=+1091.917198201" lastFinishedPulling="2025-12-17 09:22:59.07234647 +0000 UTC m=+1098.732187243" observedRunningTime="2025-12-17 09:23:00.101457612 +0000 UTC m=+1099.761298385" watchObservedRunningTime="2025-12-17 09:23:00.106033265 +0000 UTC m=+1099.765874028"
Dec 17 09:23:01 crc kubenswrapper[4935]: I1217 09:23:01.121832 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jhrvs" event={"ID":"dcc296df-d8ab-4200-ae5f-3ecf58c614b1","Type":"ContainerStarted","Data":"b7f1706b2bf7667527d0c87dc0f45d827a5982ea7de175630db3c2be0e906774"}
Dec 17 09:23:01 crc kubenswrapper[4935]: I1217 09:23:01.155401 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=23.070541518 podStartE2EDuration="30.155382401s" podCreationTimestamp="2025-12-17 09:22:31 +0000 UTC" firstStartedPulling="2025-12-17 09:22:52.007102376 +0000 UTC m=+1091.666943139" lastFinishedPulling="2025-12-17 09:22:59.091943259 +0000 UTC m=+1098.751784022" observedRunningTime="2025-12-17 09:23:00.200943722 +0000 UTC m=+1099.860784495" watchObservedRunningTime="2025-12-17 09:23:01.155382401 +0000 UTC m=+1100.815223154"
Dec 17 09:23:02 crc kubenswrapper[4935]: I1217 09:23:02.135533 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jhrvs" event={"ID":"dcc296df-d8ab-4200-ae5f-3ecf58c614b1","Type":"ContainerStarted","Data":"134d2acedfb6146ff1bddb2551028c3ef9636f26fdc03203b6a6051e4b6ecfea"}
Dec 17 09:23:02 crc kubenswrapper[4935]: I1217 09:23:02.136103 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jhrvs"
Dec 17 09:23:02 crc kubenswrapper[4935]: I1217 09:23:02.163835 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jhrvs" podStartSLOduration=21.51275173 podStartE2EDuration="28.163811808s" podCreationTimestamp="2025-12-17 09:22:34 +0000 UTC" firstStartedPulling="2025-12-17 09:22:52.363378738 +0000 UTC m=+1092.023219501" lastFinishedPulling="2025-12-17 09:22:59.014438826 +0000 UTC m=+1098.674279579" observedRunningTime="2025-12-17 09:23:02.158305994 +0000 UTC m=+1101.818146767" watchObservedRunningTime="2025-12-17 09:23:02.163811808 +0000 UTC m=+1101.823652571"
Dec 17 09:23:03 crc kubenswrapper[4935]: I1217 09:23:03.146818 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a4f8af4b-aed3-46a8-84a0-aeae265a1309","Type":"ContainerStarted","Data":"5bb93713fc372d05839b32be4b80ec20e7ef52eaa6d772e0e40ce1e1f753a049"}
Dec 17 09:23:03 crc kubenswrapper[4935]: I1217 09:23:03.149534 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"dca41fc9-68e2-4f42-88fe-942695deca13","Type":"ContainerStarted","Data":"2fef65d45c49916f064e2f15aa3d3547e80d406c73b771a101ce709899cf8492"}
Dec 17 09:23:03 crc kubenswrapper[4935]: I1217 09:23:03.149735 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jhrvs"
Dec 17 09:23:03 crc kubenswrapper[4935]: I1217 09:23:03.168825 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=19.559401116 podStartE2EDuration="29.168804542s" podCreationTimestamp="2025-12-17 09:22:34 +0000 UTC" firstStartedPulling="2025-12-17 09:22:53.165966158 +0000 UTC m=+1092.825806921" lastFinishedPulling="2025-12-17 09:23:02.775369584 +0000 UTC m=+1102.435210347" observedRunningTime="2025-12-17 09:23:03.166528466 +0000 UTC m=+1102.826369239" watchObservedRunningTime="2025-12-17 09:23:03.168804542 +0000 UTC m=+1102.828645305"
Dec 17 09:23:03 crc kubenswrapper[4935]: I1217 09:23:03.193369 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.960014079 podStartE2EDuration="25.193344152s" podCreationTimestamp="2025-12-17 09:22:38 +0000 UTC" firstStartedPulling="2025-12-17 09:22:52.55759092 +0000 UTC m=+1092.217431683" lastFinishedPulling="2025-12-17 09:23:02.790920993 +0000 UTC m=+1102.450761756" observedRunningTime="2025-12-17 09:23:03.189311943 +0000 UTC m=+1102.849152726" watchObservedRunningTime="2025-12-17 09:23:03.193344152 +0000 UTC m=+1102.853184925"
Dec 17 09:23:03 crc kubenswrapper[4935]: I1217 09:23:03.776463 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 17 09:23:03 crc kubenswrapper[4935]: I1217 09:23:03.824748 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.161719 4935 generic.go:334] "Generic (PLEG): container finished" podID="594360f1-b2e1-4e64-82d5-f6e471cd6850" containerID="3b263b20c3bedbf6e502357706ad7c16c322eaa85c6721ce7d4944014b304731" exitCode=0
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.161843 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"594360f1-b2e1-4e64-82d5-f6e471cd6850","Type":"ContainerDied","Data":"3b263b20c3bedbf6e502357706ad7c16c322eaa85c6721ce7d4944014b304731"}
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.165831 4935 generic.go:334] "Generic (PLEG): container finished" podID="d3d1b63f-c619-4973-bad8-c90d12ccbbe1" containerID="45ae7d100e9b5648cc1566801d7d45b7ed34674fb827f3e970a53c7671898916" exitCode=0
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.166054 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d1b63f-c619-4973-bad8-c90d12ccbbe1","Type":"ContainerDied","Data":"45ae7d100e9b5648cc1566801d7d45b7ed34674fb827f3e970a53c7671898916"}
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.166436 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.219663 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.502826 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-2f4wh"]
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.553345 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-48lfw"]
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.554734 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.559662 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.612025 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-794868bd45-6tqh7"]
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.613686 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.616665 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.623820 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-48lfw"]
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.640260 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-6tqh7"]
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.682210 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7aab2b87-c484-40c4-a3c5-652f874476b2-ovs-rundir\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.682781 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd6l7\" (UniqueName: \"kubernetes.io/projected/7aab2b87-c484-40c4-a3c5-652f874476b2-kube-api-access-fd6l7\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.682877 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-config\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.682921 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-dns-svc\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.682943 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aab2b87-c484-40c4-a3c5-652f874476b2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.682967 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aab2b87-c484-40c4-a3c5-652f874476b2-combined-ca-bundle\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.682990 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.683029 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgf5\" (UniqueName: \"kubernetes.io/projected/df393918-e2ab-4597-b1c6-acdc7034a6d3-kube-api-access-tfgf5\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.683114 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aab2b87-c484-40c4-a3c5-652f874476b2-config\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.683134 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7aab2b87-c484-40c4-a3c5-652f874476b2-ovn-rundir\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785364 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-config\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785428 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-dns-svc\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785448 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aab2b87-c484-40c4-a3c5-652f874476b2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785469 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aab2b87-c484-40c4-a3c5-652f874476b2-combined-ca-bundle\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785490 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785519 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfgf5\" (UniqueName: \"kubernetes.io/projected/df393918-e2ab-4597-b1c6-acdc7034a6d3-kube-api-access-tfgf5\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785546 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aab2b87-c484-40c4-a3c5-652f874476b2-config\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785571 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7aab2b87-c484-40c4-a3c5-652f874476b2-ovn-rundir\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785597 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7aab2b87-c484-40c4-a3c5-652f874476b2-ovs-rundir\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.785626 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd6l7\" (UniqueName: \"kubernetes.io/projected/7aab2b87-c484-40c4-a3c5-652f874476b2-kube-api-access-fd6l7\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.797565 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7aab2b87-c484-40c4-a3c5-652f874476b2-ovn-rundir\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.798196 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7aab2b87-c484-40c4-a3c5-652f874476b2-ovs-rundir\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.798295 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-dns-svc\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.798410 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.799025 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7aab2b87-c484-40c4-a3c5-652f874476b2-config\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.799126 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-config\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.809336 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aab2b87-c484-40c4-a3c5-652f874476b2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.811389 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aab2b87-c484-40c4-a3c5-652f874476b2-combined-ca-bundle\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.829921 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd6l7\" (UniqueName: \"kubernetes.io/projected/7aab2b87-c484-40c4-a3c5-652f874476b2-kube-api-access-fd6l7\") pod \"ovn-controller-metrics-48lfw\" (UID: \"7aab2b87-c484-40c4-a3c5-652f874476b2\") " pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.831472 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgf5\" (UniqueName: \"kubernetes.io/projected/df393918-e2ab-4597-b1c6-acdc7034a6d3-kube-api-access-tfgf5\") pod \"dnsmasq-dns-794868bd45-6tqh7\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.906213 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-48lfw"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.907798 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-v92mg"]
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.949580 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-f4l7g"]
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.951307 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.951787 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.952451 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.955776 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.956367 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-f4l7g"] Dec 17 09:23:04 crc kubenswrapper[4935]: I1217 09:23:04.957878 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.006920 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-config\") pod \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.007431 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-dns-svc\") pod \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.007704 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzj5m\" (UniqueName: \"kubernetes.io/projected/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-kube-api-access-tzj5m\") pod \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\" (UID: \"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53\") " Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.007982 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-config" (OuterVolumeSpecName: "config") pod "f25c22f0-aa41-47d9-ab2b-e3cdc9394f53" (UID: "f25c22f0-aa41-47d9-ab2b-e3cdc9394f53"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.008238 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bhl\" (UniqueName: \"kubernetes.io/projected/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-kube-api-access-h8bhl\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.008367 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.008604 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.008678 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-config\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.008697 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: 
\"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.008820 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.009818 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f25c22f0-aa41-47d9-ab2b-e3cdc9394f53" (UID: "f25c22f0-aa41-47d9-ab2b-e3cdc9394f53"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.016647 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-kube-api-access-tzj5m" (OuterVolumeSpecName: "kube-api-access-tzj5m") pod "f25c22f0-aa41-47d9-ab2b-e3cdc9394f53" (UID: "f25c22f0-aa41-47d9-ab2b-e3cdc9394f53"). InnerVolumeSpecName "kube-api-access-tzj5m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.112495 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.112575 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-config\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.112603 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.112746 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bhl\" (UniqueName: \"kubernetes.io/projected/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-kube-api-access-h8bhl\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.112785 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 
09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.112867 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzj5m\" (UniqueName: \"kubernetes.io/projected/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-kube-api-access-tzj5m\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.112890 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.115459 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.115841 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-config\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.122487 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.122932 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " 
pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.160206 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8bhl\" (UniqueName: \"kubernetes.io/projected/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-kube-api-access-h8bhl\") pod \"dnsmasq-dns-757dc6fff9-f4l7g\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") " pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.197444 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" event={"ID":"f25c22f0-aa41-47d9-ab2b-e3cdc9394f53","Type":"ContainerDied","Data":"a5d127977ad0f469f05de017b842b8727db5327761a725bab9a508705a0835ab"} Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.197689 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-2f4wh" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.213738 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"594360f1-b2e1-4e64-82d5-f6e471cd6850","Type":"ContainerStarted","Data":"1a67c2afb3681f364555c4728e1858f36e1eafbaa2a10ceb89d83e7f6b1f8296"} Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.227152 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"d3d1b63f-c619-4973-bad8-c90d12ccbbe1","Type":"ContainerStarted","Data":"90052c41066f29be2feb1e0f0e0a01b1f38d81740325e39c47bc32d415f77e69"} Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.260352 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.619329686 podStartE2EDuration="37.26033186s" podCreationTimestamp="2025-12-17 09:22:28 +0000 UTC" firstStartedPulling="2025-12-17 09:22:52.45768938 +0000 UTC m=+1092.117530143" lastFinishedPulling="2025-12-17 09:22:59.098691554 +0000 UTC 
m=+1098.758532317" observedRunningTime="2025-12-17 09:23:05.249176168 +0000 UTC m=+1104.909016931" watchObservedRunningTime="2025-12-17 09:23:05.26033186 +0000 UTC m=+1104.920172623" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.276845 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=32.571374236 podStartE2EDuration="39.276821203s" podCreationTimestamp="2025-12-17 09:22:26 +0000 UTC" firstStartedPulling="2025-12-17 09:22:52.366126695 +0000 UTC m=+1092.025967458" lastFinishedPulling="2025-12-17 09:22:59.071573622 +0000 UTC m=+1098.731414425" observedRunningTime="2025-12-17 09:23:05.274878155 +0000 UTC m=+1104.934718918" watchObservedRunningTime="2025-12-17 09:23:05.276821203 +0000 UTC m=+1104.936661966" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.284989 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.294719 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.341341 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-2f4wh"] Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.360183 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-2f4wh"] Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.422466 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-dns-svc\") pod \"f10b4928-6f26-4d19-af69-7a924b448297\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.422971 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgrpv\" (UniqueName: \"kubernetes.io/projected/f10b4928-6f26-4d19-af69-7a924b448297-kube-api-access-sgrpv\") pod \"f10b4928-6f26-4d19-af69-7a924b448297\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.423047 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-config\") pod \"f10b4928-6f26-4d19-af69-7a924b448297\" (UID: \"f10b4928-6f26-4d19-af69-7a924b448297\") " Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.424291 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f10b4928-6f26-4d19-af69-7a924b448297" (UID: "f10b4928-6f26-4d19-af69-7a924b448297"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.424550 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-config" (OuterVolumeSpecName: "config") pod "f10b4928-6f26-4d19-af69-7a924b448297" (UID: "f10b4928-6f26-4d19-af69-7a924b448297"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.428970 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10b4928-6f26-4d19-af69-7a924b448297-kube-api-access-sgrpv" (OuterVolumeSpecName: "kube-api-access-sgrpv") pod "f10b4928-6f26-4d19-af69-7a924b448297" (UID: "f10b4928-6f26-4d19-af69-7a924b448297"). InnerVolumeSpecName "kube-api-access-sgrpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.513715 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-6tqh7"] Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.528128 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.528935 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgrpv\" (UniqueName: \"kubernetes.io/projected/f10b4928-6f26-4d19-af69-7a924b448297-kube-api-access-sgrpv\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.528992 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f10b4928-6f26-4d19-af69-7a924b448297-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.537162 4935 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ovn-controller-metrics-48lfw"] Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.783150 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-f4l7g"] Dec 17 09:23:05 crc kubenswrapper[4935]: W1217 09:23:05.787892 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddde6307f_9e87_4ae7_a986_a5bdf9b1d59f.slice/crio-6666570f61882b0e9bfb75fa2ce8dbaa9de967a03bcbef591488ba1a0c52338d WatchSource:0}: Error finding container 6666570f61882b0e9bfb75fa2ce8dbaa9de967a03bcbef591488ba1a0c52338d: Status 404 returned error can't find the container with id 6666570f61882b0e9bfb75fa2ce8dbaa9de967a03bcbef591488ba1a0c52338d Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.942812 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.942893 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 17 09:23:05 crc kubenswrapper[4935]: I1217 09:23:05.983926 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.234932 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-48lfw" event={"ID":"7aab2b87-c484-40c4-a3c5-652f874476b2","Type":"ContainerStarted","Data":"582ce0805ecb4e1cdad9a37d1228cbccce9146a0630a4f11d35fcc9e2e46a7a7"} Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.235545 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-48lfw" event={"ID":"7aab2b87-c484-40c4-a3c5-652f874476b2","Type":"ContainerStarted","Data":"ac91921371ca517e2a01f78636bb66c8ea8fc71d180c55a783b78ae1c8d97d86"} Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.236849 4935 generic.go:334] "Generic 
(PLEG): container finished" podID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerID="fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467" exitCode=0 Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.236998 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" event={"ID":"df393918-e2ab-4597-b1c6-acdc7034a6d3","Type":"ContainerDied","Data":"fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467"} Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.237040 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" event={"ID":"df393918-e2ab-4597-b1c6-acdc7034a6d3","Type":"ContainerStarted","Data":"e80195584d704e9e7b84c1b8eab35f446c63fadea1a9540c0f17def8e6b13808"} Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.241487 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" event={"ID":"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f","Type":"ContainerStarted","Data":"6666570f61882b0e9bfb75fa2ce8dbaa9de967a03bcbef591488ba1a0c52338d"} Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.243723 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.243718 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-v92mg" event={"ID":"f10b4928-6f26-4d19-af69-7a924b448297","Type":"ContainerDied","Data":"8904d9861040c02456e4c850abfb2cb697bfeaf0ffffa773414300afca41c651"} Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.261793 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-48lfw" podStartSLOduration=2.261766177 podStartE2EDuration="2.261766177s" podCreationTimestamp="2025-12-17 09:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:23:06.252655215 +0000 UTC m=+1105.912495978" watchObservedRunningTime="2025-12-17 09:23:06.261766177 +0000 UTC m=+1105.921606940" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.363776 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.492457 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-v92mg"] Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.500241 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-v92mg"] Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.814664 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.816666 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.820076 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.821419 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gnhgf" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.825299 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.826080 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.832537 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.876309 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36637f4-f52a-47af-8f2d-439f62b55b8d-scripts\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.876384 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36637f4-f52a-47af-8f2d-439f62b55b8d-config\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.876423 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d36637f4-f52a-47af-8f2d-439f62b55b8d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc 
kubenswrapper[4935]: I1217 09:23:06.876445 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.876475 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.876523 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.876580 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5frbw\" (UniqueName: \"kubernetes.io/projected/d36637f4-f52a-47af-8f2d-439f62b55b8d-kube-api-access-5frbw\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.977880 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5frbw\" (UniqueName: \"kubernetes.io/projected/d36637f4-f52a-47af-8f2d-439f62b55b8d-kube-api-access-5frbw\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0" Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.977984 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36637f4-f52a-47af-8f2d-439f62b55b8d-scripts\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.978014 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36637f4-f52a-47af-8f2d-439f62b55b8d-config\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.978042 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d36637f4-f52a-47af-8f2d-439f62b55b8d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.978069 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.978096 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.978328 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.978814 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d36637f4-f52a-47af-8f2d-439f62b55b8d-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.979414 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d36637f4-f52a-47af-8f2d-439f62b55b8d-config\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.980106 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d36637f4-f52a-47af-8f2d-439f62b55b8d-scripts\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.984055 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.984381 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.984624 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d36637f4-f52a-47af-8f2d-439f62b55b8d-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:06 crc kubenswrapper[4935]: I1217 09:23:06.996187 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5frbw\" (UniqueName: \"kubernetes.io/projected/d36637f4-f52a-47af-8f2d-439f62b55b8d-kube-api-access-5frbw\") pod \"ovn-northd-0\" (UID: \"d36637f4-f52a-47af-8f2d-439f62b55b8d\") " pod="openstack/ovn-northd-0"
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.135393 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10b4928-6f26-4d19-af69-7a924b448297" path="/var/lib/kubelet/pods/f10b4928-6f26-4d19-af69-7a924b448297/volumes"
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.135802 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f25c22f0-aa41-47d9-ab2b-e3cdc9394f53" path="/var/lib/kubelet/pods/f25c22f0-aa41-47d9-ab2b-e3cdc9394f53/volumes"
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.145855 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.263359 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" event={"ID":"df393918-e2ab-4597-b1c6-acdc7034a6d3","Type":"ContainerStarted","Data":"7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90"}
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.264896 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-794868bd45-6tqh7"
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.274018 4935 generic.go:334] "Generic (PLEG): container finished" podID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerID="5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660" exitCode=0
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.274558 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" event={"ID":"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f","Type":"ContainerDied","Data":"5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660"}
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.297717 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" podStartSLOduration=2.810491667 podStartE2EDuration="3.297677425s" podCreationTimestamp="2025-12-17 09:23:04 +0000 UTC" firstStartedPulling="2025-12-17 09:23:05.538639847 +0000 UTC m=+1105.198480610" lastFinishedPulling="2025-12-17 09:23:06.025825605 +0000 UTC m=+1105.685666368" observedRunningTime="2025-12-17 09:23:07.289124377 +0000 UTC m=+1106.948965190" watchObservedRunningTime="2025-12-17 09:23:07.297677425 +0000 UTC m=+1106.957518188"
Dec 17 09:23:07 crc kubenswrapper[4935]: I1217 09:23:07.639014 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 17 09:23:08 crc kubenswrapper[4935]: I1217 09:23:08.284606 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d36637f4-f52a-47af-8f2d-439f62b55b8d","Type":"ContainerStarted","Data":"2965adabb287aeb8762122c3a7f2df2145387db9feeaddc9c8ec884409040a37"}
Dec 17 09:23:08 crc kubenswrapper[4935]: I1217 09:23:08.621773 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Dec 17 09:23:08 crc kubenswrapper[4935]: I1217 09:23:08.621857 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Dec 17 09:23:09 crc kubenswrapper[4935]: I1217 09:23:09.687523 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Dec 17 09:23:09 crc kubenswrapper[4935]: I1217 09:23:09.687630 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.018013 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.093366 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-f4l7g"]
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.140836 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-96f46"]
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.155022 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.178725 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-96f46"]
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.241309 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.241385 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-config\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.241456 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.241492 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85j82\" (UniqueName: \"kubernetes.io/projected/582e42e7-25fe-4a3e-9a76-87f9c62af78c-kube-api-access-85j82\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.241610 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.322892 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" event={"ID":"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f","Type":"ContainerStarted","Data":"80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02"}
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.324686 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.343094 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.343172 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-config\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.343207 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.343248 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85j82\" (UniqueName: \"kubernetes.io/projected/582e42e7-25fe-4a3e-9a76-87f9c62af78c-kube-api-access-85j82\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.343339 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.344672 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.345628 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-config\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.346156 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.347416 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.355209 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" podStartSLOduration=7.780472233 podStartE2EDuration="8.355176687s" podCreationTimestamp="2025-12-17 09:23:04 +0000 UTC" firstStartedPulling="2025-12-17 09:23:05.792582649 +0000 UTC m=+1105.452423412" lastFinishedPulling="2025-12-17 09:23:06.367287103 +0000 UTC m=+1106.027127866" observedRunningTime="2025-12-17 09:23:12.353703261 +0000 UTC m=+1112.013544064" watchObservedRunningTime="2025-12-17 09:23:12.355176687 +0000 UTC m=+1112.015017450"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.409880 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85j82\" (UniqueName: \"kubernetes.io/projected/582e42e7-25fe-4a3e-9a76-87f9c62af78c-kube-api-access-85j82\") pod \"dnsmasq-dns-6cb545bd4c-96f46\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.492183 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.770584 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Dec 17 09:23:12 crc kubenswrapper[4935]: I1217 09:23:12.898932 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.009406 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-96f46"]
Dec 17 09:23:13 crc kubenswrapper[4935]: W1217 09:23:13.073835 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod582e42e7_25fe_4a3e_9a76_87f9c62af78c.slice/crio-0ac329e8458184dc3d8cda0e5abe6f7ba22f23cd38dddcff996dbfa444691319 WatchSource:0}: Error finding container 0ac329e8458184dc3d8cda0e5abe6f7ba22f23cd38dddcff996dbfa444691319: Status 404 returned error can't find the container with id 0ac329e8458184dc3d8cda0e5abe6f7ba22f23cd38dddcff996dbfa444691319
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.234512 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.241532 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.251960 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.252081 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.252421 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-qq6xg"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.252847 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.261427 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.347611 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" event={"ID":"582e42e7-25fe-4a3e-9a76-87f9c62af78c","Type":"ContainerStarted","Data":"0ac329e8458184dc3d8cda0e5abe6f7ba22f23cd38dddcff996dbfa444691319"}
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.348207 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" podUID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerName="dnsmasq-dns" containerID="cri-o://80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02" gracePeriod=10
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.365531 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-cache\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.365622 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.365696 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-lock\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.367294 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hc6x\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-kube-api-access-5hc6x\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.367340 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.469162 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-cache\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.469263 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.469330 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-lock\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.469403 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hc6x\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-kube-api-access-5hc6x\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.469434 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.470466 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.470681 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-lock\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.470708 4935 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.470729 4935 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.470771 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-cache\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.470788 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift podName:a2c97a57-ac68-4fab-acbd-ecdec8db5fb5 nodeName:}" failed. No retries permitted until 2025-12-17 09:23:13.970762312 +0000 UTC m=+1113.630603295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift") pod "swift-storage-0" (UID: "a2c97a57-ac68-4fab-acbd-ecdec8db5fb5") : configmap "swift-ring-files" not found
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.499731 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hc6x\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-kube-api-access-5hc6x\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.513513 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.800822 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.848696 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-jdzl5"]
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.849090 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerName="init"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.849111 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerName="init"
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.849165 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerName="dnsmasq-dns"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.849172 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerName="dnsmasq-dns"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.849372 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerName="dnsmasq-dns"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.850027 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.854882 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.854878 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.854878 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.871155 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jdzl5"]
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.980791 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-sb\") pod \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") "
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.980906 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-config\") pod \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") "
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.980956 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-dns-svc\") pod \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") "
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.980997 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8bhl\" (UniqueName: \"kubernetes.io/projected/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-kube-api-access-h8bhl\") pod \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") "
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981064 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-nb\") pod \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\" (UID: \"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f\") "
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981315 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-etc-swift\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981378 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981397 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-dispersionconf\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981413 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-ring-data-devices\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981503 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-scripts\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981533 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-combined-ca-bundle\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981553 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk8p9\" (UniqueName: \"kubernetes.io/projected/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-kube-api-access-hk8p9\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.981573 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-swiftconf\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.981985 4935 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.982021 4935 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Dec 17 09:23:13 crc kubenswrapper[4935]: E1217 09:23:13.982099 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift podName:a2c97a57-ac68-4fab-acbd-ecdec8db5fb5 nodeName:}" failed. No retries permitted until 2025-12-17 09:23:14.982070419 +0000 UTC m=+1114.641911182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift") pod "swift-storage-0" (UID: "a2c97a57-ac68-4fab-acbd-ecdec8db5fb5") : configmap "swift-ring-files" not found
Dec 17 09:23:13 crc kubenswrapper[4935]: I1217 09:23:13.988550 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-kube-api-access-h8bhl" (OuterVolumeSpecName: "kube-api-access-h8bhl") pod "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" (UID: "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f"). InnerVolumeSpecName "kube-api-access-h8bhl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.029168 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-config" (OuterVolumeSpecName: "config") pod "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" (UID: "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.030925 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" (UID: "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.037692 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" (UID: "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.040019 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" (UID: "dde6307f-9e87-4ae7-a986-a5bdf9b1d59f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.083606 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-scripts\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084348 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-combined-ca-bundle\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084421 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk8p9\" (UniqueName: \"kubernetes.io/projected/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-kube-api-access-hk8p9\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084451 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-swiftconf\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084612 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-etc-swift\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084726 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-dispersionconf\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084760 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-ring-data-devices\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084947 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084965 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-config\") on node \"crc\" DevicePath \"\""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084977 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.084991 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8bhl\" (UniqueName: \"kubernetes.io/projected/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-kube-api-access-h8bhl\") on node \"crc\" DevicePath \"\""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.085003 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.086006 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-ring-data-devices\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.086023 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-etc-swift\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5"
Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.089389 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-combined-ca-bundle\") pod 
\"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.089421 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-scripts\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.089796 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-swiftconf\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.091379 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-dispersionconf\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.111169 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk8p9\" (UniqueName: \"kubernetes.io/projected/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-kube-api-access-hk8p9\") pod \"swift-ring-rebalance-jdzl5\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.178818 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.362783 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d36637f4-f52a-47af-8f2d-439f62b55b8d","Type":"ContainerStarted","Data":"11c405b22bd942175ffc5eaab08724351b7956125a918f3c6246c1afd8b9ebf1"} Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.362999 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d36637f4-f52a-47af-8f2d-439f62b55b8d","Type":"ContainerStarted","Data":"1794d058831da4e3c70f4ab5699a1758fc59f5df61e98b0b3d19d43d8035a72c"} Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.363489 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.370604 4935 generic.go:334] "Generic (PLEG): container finished" podID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" containerID="80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02" exitCode=0 Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.370875 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.371532 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" event={"ID":"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f","Type":"ContainerDied","Data":"80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02"} Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.371601 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-f4l7g" event={"ID":"dde6307f-9e87-4ae7-a986-a5bdf9b1d59f","Type":"ContainerDied","Data":"6666570f61882b0e9bfb75fa2ce8dbaa9de967a03bcbef591488ba1a0c52338d"} Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.371628 4935 scope.go:117] "RemoveContainer" containerID="80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.373850 4935 generic.go:334] "Generic (PLEG): container finished" podID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerID="d76653a5dc31a7a0d5987a3874238d7ac5ac68675c001840510d3fd2fe2d7a96" exitCode=0 Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.373900 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" event={"ID":"582e42e7-25fe-4a3e-9a76-87f9c62af78c","Type":"ContainerDied","Data":"d76653a5dc31a7a0d5987a3874238d7ac5ac68675c001840510d3fd2fe2d7a96"} Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.417151 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.939722167 podStartE2EDuration="8.417116993s" podCreationTimestamp="2025-12-17 09:23:06 +0000 UTC" firstStartedPulling="2025-12-17 09:23:07.642352163 +0000 UTC m=+1107.302192926" lastFinishedPulling="2025-12-17 09:23:13.119746989 +0000 UTC m=+1112.779587752" observedRunningTime="2025-12-17 09:23:14.399749239 +0000 UTC m=+1114.059590002" 
watchObservedRunningTime="2025-12-17 09:23:14.417116993 +0000 UTC m=+1114.076957756" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.439324 4935 scope.go:117] "RemoveContainer" containerID="5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.557223 4935 scope.go:117] "RemoveContainer" containerID="80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02" Dec 17 09:23:14 crc kubenswrapper[4935]: E1217 09:23:14.557968 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02\": container with ID starting with 80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02 not found: ID does not exist" containerID="80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.558024 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02"} err="failed to get container status \"80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02\": rpc error: code = NotFound desc = could not find container \"80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02\": container with ID starting with 80fdf6d6c85bd5a8d7c8569986ea357780b4a5e84729df4af8027fcfdc109e02 not found: ID does not exist" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.558057 4935 scope.go:117] "RemoveContainer" containerID="5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660" Dec 17 09:23:14 crc kubenswrapper[4935]: E1217 09:23:14.558455 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660\": container with ID starting with 
5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660 not found: ID does not exist" containerID="5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.558584 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660"} err="failed to get container status \"5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660\": rpc error: code = NotFound desc = could not find container \"5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660\": container with ID starting with 5b27f90c018b58920e162275e2ba61032ae83d93e1086555d675a61174582660 not found: ID does not exist" Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.633334 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-f4l7g"] Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.642499 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-f4l7g"] Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.669989 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-jdzl5"] Dec 17 09:23:14 crc kubenswrapper[4935]: W1217 09:23:14.677012 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod499a79fe_0a9a_4a2c_98b1_1e0c99c2bdc7.slice/crio-d16f7fdc2205c93e1649ad0288c7508ec3323992797156298d479f4174da7602 WatchSource:0}: Error finding container d16f7fdc2205c93e1649ad0288c7508ec3323992797156298d479f4174da7602: Status 404 returned error can't find the container with id d16f7fdc2205c93e1649ad0288c7508ec3323992797156298d479f4174da7602 Dec 17 09:23:14 crc kubenswrapper[4935]: I1217 09:23:14.954467 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" Dec 17 09:23:15 crc 
kubenswrapper[4935]: I1217 09:23:15.014771 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0" Dec 17 09:23:15 crc kubenswrapper[4935]: E1217 09:23:15.015071 4935 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 17 09:23:15 crc kubenswrapper[4935]: E1217 09:23:15.015093 4935 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 17 09:23:15 crc kubenswrapper[4935]: E1217 09:23:15.015182 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift podName:a2c97a57-ac68-4fab-acbd-ecdec8db5fb5 nodeName:}" failed. No retries permitted until 2025-12-17 09:23:17.015147148 +0000 UTC m=+1116.674987921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift") pod "swift-storage-0" (UID: "a2c97a57-ac68-4fab-acbd-ecdec8db5fb5") : configmap "swift-ring-files" not found Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.135860 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde6307f-9e87-4ae7-a986-a5bdf9b1d59f" path="/var/lib/kubelet/pods/dde6307f-9e87-4ae7-a986-a5bdf9b1d59f/volumes" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.277969 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-5d06-account-create-update-wwsz6"] Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.279206 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.281500 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.291155 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-2dr9n"] Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.294528 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.305019 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5d06-account-create-update-wwsz6"] Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.335210 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2dr9n"] Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.384253 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jdzl5" event={"ID":"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7","Type":"ContainerStarted","Data":"d16f7fdc2205c93e1649ad0288c7508ec3323992797156298d479f4174da7602"} Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.388658 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" event={"ID":"582e42e7-25fe-4a3e-9a76-87f9c62af78c","Type":"ContainerStarted","Data":"ae969a21e12d842efa534d2ea406cd7a71b99213044780025d473d6e16ccf39f"} Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.388752 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.415865 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" podStartSLOduration=3.415841453 podStartE2EDuration="3.415841453s" podCreationTimestamp="2025-12-17 
09:23:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:23:15.411115308 +0000 UTC m=+1115.070956081" watchObservedRunningTime="2025-12-17 09:23:15.415841453 +0000 UTC m=+1115.075682216" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.422816 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd983b91-5108-4717-a2d2-4324cbd041eb-operator-scripts\") pod \"glance-5d06-account-create-update-wwsz6\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.424379 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc6g5\" (UniqueName: \"kubernetes.io/projected/e9d6a1ba-bf06-4798-87c6-8980f387fe14-kube-api-access-vc6g5\") pod \"glance-db-create-2dr9n\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.424483 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d6a1ba-bf06-4798-87c6-8980f387fe14-operator-scripts\") pod \"glance-db-create-2dr9n\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.424690 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52b56\" (UniqueName: \"kubernetes.io/projected/cd983b91-5108-4717-a2d2-4324cbd041eb-kube-api-access-52b56\") pod \"glance-5d06-account-create-update-wwsz6\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:15 crc 
kubenswrapper[4935]: I1217 09:23:15.526383 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52b56\" (UniqueName: \"kubernetes.io/projected/cd983b91-5108-4717-a2d2-4324cbd041eb-kube-api-access-52b56\") pod \"glance-5d06-account-create-update-wwsz6\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.526566 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd983b91-5108-4717-a2d2-4324cbd041eb-operator-scripts\") pod \"glance-5d06-account-create-update-wwsz6\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.526636 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc6g5\" (UniqueName: \"kubernetes.io/projected/e9d6a1ba-bf06-4798-87c6-8980f387fe14-kube-api-access-vc6g5\") pod \"glance-db-create-2dr9n\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.526686 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d6a1ba-bf06-4798-87c6-8980f387fe14-operator-scripts\") pod \"glance-db-create-2dr9n\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.527614 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd983b91-5108-4717-a2d2-4324cbd041eb-operator-scripts\") pod \"glance-5d06-account-create-update-wwsz6\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " pod="openstack/glance-5d06-account-create-update-wwsz6" 
Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.527664 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d6a1ba-bf06-4798-87c6-8980f387fe14-operator-scripts\") pod \"glance-db-create-2dr9n\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.552194 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc6g5\" (UniqueName: \"kubernetes.io/projected/e9d6a1ba-bf06-4798-87c6-8980f387fe14-kube-api-access-vc6g5\") pod \"glance-db-create-2dr9n\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.552944 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52b56\" (UniqueName: \"kubernetes.io/projected/cd983b91-5108-4717-a2d2-4324cbd041eb-kube-api-access-52b56\") pod \"glance-5d06-account-create-update-wwsz6\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.598603 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.615388 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.824804 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 17 09:23:15 crc kubenswrapper[4935]: I1217 09:23:15.931522 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 17 09:23:16 crc kubenswrapper[4935]: I1217 09:23:16.153443 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-5d06-account-create-update-wwsz6"] Dec 17 09:23:16 crc kubenswrapper[4935]: I1217 09:23:16.266887 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-2dr9n"] Dec 17 09:23:16 crc kubenswrapper[4935]: I1217 09:23:16.397004 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2dr9n" event={"ID":"e9d6a1ba-bf06-4798-87c6-8980f387fe14","Type":"ContainerStarted","Data":"03df7effdf95f56343d23ad9cbc645f1797c1fe829a0dd2b28f268e059508a72"} Dec 17 09:23:16 crc kubenswrapper[4935]: I1217 09:23:16.400923 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d06-account-create-update-wwsz6" event={"ID":"cd983b91-5108-4717-a2d2-4324cbd041eb","Type":"ContainerStarted","Data":"f0034f50d76085dc4924fe6ca4eaa5663036d438b11cffdd907f4b5105d4f892"} Dec 17 09:23:17 crc kubenswrapper[4935]: I1217 09:23:17.060799 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0" Dec 17 09:23:17 crc kubenswrapper[4935]: E1217 09:23:17.061036 4935 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 17 09:23:17 crc kubenswrapper[4935]: E1217 09:23:17.061401 4935 
projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 17 09:23:17 crc kubenswrapper[4935]: E1217 09:23:17.061468 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift podName:a2c97a57-ac68-4fab-acbd-ecdec8db5fb5 nodeName:}" failed. No retries permitted until 2025-12-17 09:23:21.061449031 +0000 UTC m=+1120.721289794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift") pod "swift-storage-0" (UID: "a2c97a57-ac68-4fab-acbd-ecdec8db5fb5") : configmap "swift-ring-files" not found Dec 17 09:23:17 crc kubenswrapper[4935]: I1217 09:23:17.413507 4935 generic.go:334] "Generic (PLEG): container finished" podID="cd983b91-5108-4717-a2d2-4324cbd041eb" containerID="094c87efb5a078eb16f0e7e48a0388b82596e884343dba5137e653e3f7010fa4" exitCode=0 Dec 17 09:23:17 crc kubenswrapper[4935]: I1217 09:23:17.413626 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d06-account-create-update-wwsz6" event={"ID":"cd983b91-5108-4717-a2d2-4324cbd041eb","Type":"ContainerDied","Data":"094c87efb5a078eb16f0e7e48a0388b82596e884343dba5137e653e3f7010fa4"} Dec 17 09:23:17 crc kubenswrapper[4935]: I1217 09:23:17.417526 4935 generic.go:334] "Generic (PLEG): container finished" podID="e9d6a1ba-bf06-4798-87c6-8980f387fe14" containerID="6996a6849f33defe574dffd691d706db6665a93e36b385cf206811ab1230d102" exitCode=0 Dec 17 09:23:17 crc kubenswrapper[4935]: I1217 09:23:17.417600 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2dr9n" event={"ID":"e9d6a1ba-bf06-4798-87c6-8980f387fe14","Type":"ContainerDied","Data":"6996a6849f33defe574dffd691d706db6665a93e36b385cf206811ab1230d102"} Dec 17 09:23:18 crc kubenswrapper[4935]: I1217 09:23:18.982295 4935 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:18 crc kubenswrapper[4935]: I1217 09:23:18.992469 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.105936 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d6a1ba-bf06-4798-87c6-8980f387fe14-operator-scripts\") pod \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.106226 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52b56\" (UniqueName: \"kubernetes.io/projected/cd983b91-5108-4717-a2d2-4324cbd041eb-kube-api-access-52b56\") pod \"cd983b91-5108-4717-a2d2-4324cbd041eb\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.106258 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd983b91-5108-4717-a2d2-4324cbd041eb-operator-scripts\") pod \"cd983b91-5108-4717-a2d2-4324cbd041eb\" (UID: \"cd983b91-5108-4717-a2d2-4324cbd041eb\") " Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.106418 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc6g5\" (UniqueName: \"kubernetes.io/projected/e9d6a1ba-bf06-4798-87c6-8980f387fe14-kube-api-access-vc6g5\") pod \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\" (UID: \"e9d6a1ba-bf06-4798-87c6-8980f387fe14\") " Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.107227 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd983b91-5108-4717-a2d2-4324cbd041eb-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "cd983b91-5108-4717-a2d2-4324cbd041eb" (UID: "cd983b91-5108-4717-a2d2-4324cbd041eb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.107266 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9d6a1ba-bf06-4798-87c6-8980f387fe14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9d6a1ba-bf06-4798-87c6-8980f387fe14" (UID: "e9d6a1ba-bf06-4798-87c6-8980f387fe14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.111850 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd983b91-5108-4717-a2d2-4324cbd041eb-kube-api-access-52b56" (OuterVolumeSpecName: "kube-api-access-52b56") pod "cd983b91-5108-4717-a2d2-4324cbd041eb" (UID: "cd983b91-5108-4717-a2d2-4324cbd041eb"). InnerVolumeSpecName "kube-api-access-52b56". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.112067 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9d6a1ba-bf06-4798-87c6-8980f387fe14-kube-api-access-vc6g5" (OuterVolumeSpecName: "kube-api-access-vc6g5") pod "e9d6a1ba-bf06-4798-87c6-8980f387fe14" (UID: "e9d6a1ba-bf06-4798-87c6-8980f387fe14"). InnerVolumeSpecName "kube-api-access-vc6g5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.212897 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9d6a1ba-bf06-4798-87c6-8980f387fe14-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.212943 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52b56\" (UniqueName: \"kubernetes.io/projected/cd983b91-5108-4717-a2d2-4324cbd041eb-kube-api-access-52b56\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.213001 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd983b91-5108-4717-a2d2-4324cbd041eb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.213015 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc6g5\" (UniqueName: \"kubernetes.io/projected/e9d6a1ba-bf06-4798-87c6-8980f387fe14-kube-api-access-vc6g5\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.437674 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jdzl5" event={"ID":"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7","Type":"ContainerStarted","Data":"d3b0283abe723315e508dafa0a4ee1f7aa5ce6cf37532f4e584fd41f6db44619"} Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.441153 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-2dr9n" event={"ID":"e9d6a1ba-bf06-4798-87c6-8980f387fe14","Type":"ContainerDied","Data":"03df7effdf95f56343d23ad9cbc645f1797c1fe829a0dd2b28f268e059508a72"} Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.441183 4935 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="03df7effdf95f56343d23ad9cbc645f1797c1fe829a0dd2b28f268e059508a72" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.441245 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-2dr9n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.442816 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-5d06-account-create-update-wwsz6" event={"ID":"cd983b91-5108-4717-a2d2-4324cbd041eb","Type":"ContainerDied","Data":"f0034f50d76085dc4924fe6ca4eaa5663036d438b11cffdd907f4b5105d4f892"} Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.442839 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0034f50d76085dc4924fe6ca4eaa5663036d438b11cffdd907f4b5105d4f892" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.442873 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-5d06-account-create-update-wwsz6" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.460177 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-7k42n"] Dec 17 09:23:19 crc kubenswrapper[4935]: E1217 09:23:19.460621 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd983b91-5108-4717-a2d2-4324cbd041eb" containerName="mariadb-account-create-update" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.460655 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd983b91-5108-4717-a2d2-4324cbd041eb" containerName="mariadb-account-create-update" Dec 17 09:23:19 crc kubenswrapper[4935]: E1217 09:23:19.460672 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9d6a1ba-bf06-4798-87c6-8980f387fe14" containerName="mariadb-database-create" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.460679 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9d6a1ba-bf06-4798-87c6-8980f387fe14" containerName="mariadb-database-create" 
Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.460845 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9d6a1ba-bf06-4798-87c6-8980f387fe14" containerName="mariadb-database-create" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.460875 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd983b91-5108-4717-a2d2-4324cbd041eb" containerName="mariadb-account-create-update" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.461514 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.475671 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7k42n"] Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.482120 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-jdzl5" podStartSLOduration=2.337377187 podStartE2EDuration="6.482092957s" podCreationTimestamp="2025-12-17 09:23:13 +0000 UTC" firstStartedPulling="2025-12-17 09:23:14.680177698 +0000 UTC m=+1114.340018461" lastFinishedPulling="2025-12-17 09:23:18.824893468 +0000 UTC m=+1118.484734231" observedRunningTime="2025-12-17 09:23:19.469934831 +0000 UTC m=+1119.129775594" watchObservedRunningTime="2025-12-17 09:23:19.482092957 +0000 UTC m=+1119.141933720" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.565166 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-9dfe-account-create-update-wk5hj"] Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.566561 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.569310 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.577530 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9dfe-account-create-update-wk5hj"] Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.620237 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-operator-scripts\") pod \"keystone-db-create-7k42n\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.620665 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52cmk\" (UniqueName: \"kubernetes.io/projected/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-kube-api-access-52cmk\") pod \"keystone-db-create-7k42n\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.723342 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-operator-scripts\") pod \"keystone-db-create-7k42n\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.723905 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52cmk\" (UniqueName: \"kubernetes.io/projected/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-kube-api-access-52cmk\") pod \"keystone-db-create-7k42n\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " 
pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.724075 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75555da0-8b97-47fa-8851-3adb9fa308ec-operator-scripts\") pod \"keystone-9dfe-account-create-update-wk5hj\" (UID: \"75555da0-8b97-47fa-8851-3adb9fa308ec\") " pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.724217 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29jq\" (UniqueName: \"kubernetes.io/projected/75555da0-8b97-47fa-8851-3adb9fa308ec-kube-api-access-g29jq\") pod \"keystone-9dfe-account-create-update-wk5hj\" (UID: \"75555da0-8b97-47fa-8851-3adb9fa308ec\") " pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.726420 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-operator-scripts\") pod \"keystone-db-create-7k42n\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.749039 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52cmk\" (UniqueName: \"kubernetes.io/projected/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-kube-api-access-52cmk\") pod \"keystone-db-create-7k42n\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.764987 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ggjn4"] Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.768906 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.776622 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ggjn4"] Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.778891 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.829213 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75555da0-8b97-47fa-8851-3adb9fa308ec-operator-scripts\") pod \"keystone-9dfe-account-create-update-wk5hj\" (UID: \"75555da0-8b97-47fa-8851-3adb9fa308ec\") " pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.828194 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75555da0-8b97-47fa-8851-3adb9fa308ec-operator-scripts\") pod \"keystone-9dfe-account-create-update-wk5hj\" (UID: \"75555da0-8b97-47fa-8851-3adb9fa308ec\") " pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.829551 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g29jq\" (UniqueName: \"kubernetes.io/projected/75555da0-8b97-47fa-8851-3adb9fa308ec-kube-api-access-g29jq\") pod \"keystone-9dfe-account-create-update-wk5hj\" (UID: \"75555da0-8b97-47fa-8851-3adb9fa308ec\") " pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.849939 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g29jq\" (UniqueName: \"kubernetes.io/projected/75555da0-8b97-47fa-8851-3adb9fa308ec-kube-api-access-g29jq\") pod \"keystone-9dfe-account-create-update-wk5hj\" (UID: 
\"75555da0-8b97-47fa-8851-3adb9fa308ec\") " pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.873401 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1418-account-create-update-lwlzp"] Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.874702 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.877025 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.882294 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1418-account-create-update-lwlzp"] Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.883488 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.932504 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw47b\" (UniqueName: \"kubernetes.io/projected/485a1677-2580-4128-8711-74c1136c0716-kube-api-access-fw47b\") pod \"placement-db-create-ggjn4\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.932554 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ed76ae-907f-4693-a696-1d43ee6fb5e2-operator-scripts\") pod \"placement-1418-account-create-update-lwlzp\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.932575 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485a1677-2580-4128-8711-74c1136c0716-operator-scripts\") pod \"placement-db-create-ggjn4\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:19 crc kubenswrapper[4935]: I1217 09:23:19.932673 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjbfc\" (UniqueName: \"kubernetes.io/projected/71ed76ae-907f-4693-a696-1d43ee6fb5e2-kube-api-access-mjbfc\") pod \"placement-1418-account-create-update-lwlzp\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.033393 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw47b\" (UniqueName: \"kubernetes.io/projected/485a1677-2580-4128-8711-74c1136c0716-kube-api-access-fw47b\") pod \"placement-db-create-ggjn4\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.033876 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ed76ae-907f-4693-a696-1d43ee6fb5e2-operator-scripts\") pod \"placement-1418-account-create-update-lwlzp\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.033945 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485a1677-2580-4128-8711-74c1136c0716-operator-scripts\") pod \"placement-db-create-ggjn4\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:20 crc 
kubenswrapper[4935]: I1217 09:23:20.034042 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjbfc\" (UniqueName: \"kubernetes.io/projected/71ed76ae-907f-4693-a696-1d43ee6fb5e2-kube-api-access-mjbfc\") pod \"placement-1418-account-create-update-lwlzp\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.035495 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ed76ae-907f-4693-a696-1d43ee6fb5e2-operator-scripts\") pod \"placement-1418-account-create-update-lwlzp\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.035592 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485a1677-2580-4128-8711-74c1136c0716-operator-scripts\") pod \"placement-db-create-ggjn4\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.060147 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw47b\" (UniqueName: \"kubernetes.io/projected/485a1677-2580-4128-8711-74c1136c0716-kube-api-access-fw47b\") pod \"placement-db-create-ggjn4\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.060222 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjbfc\" (UniqueName: \"kubernetes.io/projected/71ed76ae-907f-4693-a696-1d43ee6fb5e2-kube-api-access-mjbfc\") pod \"placement-1418-account-create-update-lwlzp\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " 
pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.090214 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.289049 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:20 crc kubenswrapper[4935]: W1217 09:23:20.319612 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98ee9cd1_7c28_49c0_89f6_d1f5e70ce023.slice/crio-1de06bbe1ab5d010ac81e5d31c03f18babc8b4d0ebb1bd8ef71004ad45af3713 WatchSource:0}: Error finding container 1de06bbe1ab5d010ac81e5d31c03f18babc8b4d0ebb1bd8ef71004ad45af3713: Status 404 returned error can't find the container with id 1de06bbe1ab5d010ac81e5d31c03f18babc8b4d0ebb1bd8ef71004ad45af3713 Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.326298 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-7k42n"] Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.428215 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-9dfe-account-create-update-wk5hj"] Dec 17 09:23:20 crc kubenswrapper[4935]: W1217 09:23:20.433166 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75555da0_8b97_47fa_8851_3adb9fa308ec.slice/crio-7ab0804c676548497540c64681dd314fb8221ccd7b1a0a8381aeae1a26c6c20f WatchSource:0}: Error finding container 7ab0804c676548497540c64681dd314fb8221ccd7b1a0a8381aeae1a26c6c20f: Status 404 returned error can't find the container with id 7ab0804c676548497540c64681dd314fb8221ccd7b1a0a8381aeae1a26c6c20f Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.456410 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-9dfe-account-create-update-wk5hj" event={"ID":"75555da0-8b97-47fa-8851-3adb9fa308ec","Type":"ContainerStarted","Data":"7ab0804c676548497540c64681dd314fb8221ccd7b1a0a8381aeae1a26c6c20f"} Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.458295 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7k42n" event={"ID":"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023","Type":"ContainerStarted","Data":"1de06bbe1ab5d010ac81e5d31c03f18babc8b4d0ebb1bd8ef71004ad45af3713"} Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.535150 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-bmvch"] Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.536357 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.543011 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.543401 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2lgf4" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.558922 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bmvch"] Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.579789 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ggjn4"] Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.645732 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-combined-ca-bundle\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.645801 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4svc5\" (UniqueName: \"kubernetes.io/projected/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-kube-api-access-4svc5\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.646524 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-config-data\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.646637 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-db-sync-config-data\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.749315 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-combined-ca-bundle\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.749422 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4svc5\" (UniqueName: \"kubernetes.io/projected/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-kube-api-access-4svc5\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.749490 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-config-data\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.749541 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-db-sync-config-data\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.756554 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-combined-ca-bundle\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.757421 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-db-sync-config-data\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.759938 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-config-data\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.764484 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1418-account-create-update-lwlzp"] Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.771829 4935 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4svc5\" (UniqueName: \"kubernetes.io/projected/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-kube-api-access-4svc5\") pod \"glance-db-sync-bmvch\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:20 crc kubenswrapper[4935]: W1217 09:23:20.786154 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71ed76ae_907f_4693_a696_1d43ee6fb5e2.slice/crio-074088530c85fa3b2c0c66124594893a9ab89e3cf1fc4d69295bb6aef439e583 WatchSource:0}: Error finding container 074088530c85fa3b2c0c66124594893a9ab89e3cf1fc4d69295bb6aef439e583: Status 404 returned error can't find the container with id 074088530c85fa3b2c0c66124594893a9ab89e3cf1fc4d69295bb6aef439e583 Dec 17 09:23:20 crc kubenswrapper[4935]: I1217 09:23:20.888236 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.159096 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0" Dec 17 09:23:21 crc kubenswrapper[4935]: E1217 09:23:21.159476 4935 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 17 09:23:21 crc kubenswrapper[4935]: E1217 09:23:21.163254 4935 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 17 09:23:21 crc kubenswrapper[4935]: E1217 09:23:21.163943 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift podName:a2c97a57-ac68-4fab-acbd-ecdec8db5fb5 nodeName:}" failed. 
No retries permitted until 2025-12-17 09:23:29.16391264 +0000 UTC m=+1128.823753403 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift") pod "swift-storage-0" (UID: "a2c97a57-ac68-4fab-acbd-ecdec8db5fb5") : configmap "swift-ring-files" not found Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.468818 4935 generic.go:334] "Generic (PLEG): container finished" podID="75555da0-8b97-47fa-8851-3adb9fa308ec" containerID="0f2fcc7c08fda2f08c8fbd662c24b964a943604b883d8d6e1d9385fde2d16777" exitCode=0 Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.468954 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9dfe-account-create-update-wk5hj" event={"ID":"75555da0-8b97-47fa-8851-3adb9fa308ec","Type":"ContainerDied","Data":"0f2fcc7c08fda2f08c8fbd662c24b964a943604b883d8d6e1d9385fde2d16777"} Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.471803 4935 generic.go:334] "Generic (PLEG): container finished" podID="485a1677-2580-4128-8711-74c1136c0716" containerID="ffbc25a3cbb476037b3e5fb9136fc2ba3f2a832e84ed3f1f0228045f7688ef1f" exitCode=0 Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.471848 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ggjn4" event={"ID":"485a1677-2580-4128-8711-74c1136c0716","Type":"ContainerDied","Data":"ffbc25a3cbb476037b3e5fb9136fc2ba3f2a832e84ed3f1f0228045f7688ef1f"} Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.471889 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ggjn4" event={"ID":"485a1677-2580-4128-8711-74c1136c0716","Type":"ContainerStarted","Data":"11f506b9c2edded2ae3209ef1f3c78b70ceaf1b41aa64eb501e2eabec6911956"} Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.473986 4935 generic.go:334] "Generic (PLEG): container finished" podID="98ee9cd1-7c28-49c0-89f6-d1f5e70ce023" 
containerID="b3c6a77bd41056590ad41afbad2be22148fc25ed8311caf463e52a2357372be8" exitCode=0 Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.474037 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7k42n" event={"ID":"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023","Type":"ContainerDied","Data":"b3c6a77bd41056590ad41afbad2be22148fc25ed8311caf463e52a2357372be8"} Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.476474 4935 generic.go:334] "Generic (PLEG): container finished" podID="71ed76ae-907f-4693-a696-1d43ee6fb5e2" containerID="756d7751af282a1839230ccb7967e446b7b0e2dbeeee06d542db49825558de89" exitCode=0 Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.476535 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1418-account-create-update-lwlzp" event={"ID":"71ed76ae-907f-4693-a696-1d43ee6fb5e2","Type":"ContainerDied","Data":"756d7751af282a1839230ccb7967e446b7b0e2dbeeee06d542db49825558de89"} Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.476580 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1418-account-create-update-lwlzp" event={"ID":"71ed76ae-907f-4693-a696-1d43ee6fb5e2","Type":"ContainerStarted","Data":"074088530c85fa3b2c0c66124594893a9ab89e3cf1fc4d69295bb6aef439e583"} Dec 17 09:23:21 crc kubenswrapper[4935]: I1217 09:23:21.494102 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-bmvch"] Dec 17 09:23:22 crc kubenswrapper[4935]: I1217 09:23:22.485590 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bmvch" event={"ID":"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07","Type":"ContainerStarted","Data":"96dfd812d9079f6e9ed5203ea424357401a538ad4ee7e87fd928c2b2053ae6d7"} Dec 17 09:23:22 crc kubenswrapper[4935]: I1217 09:23:22.495785 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" Dec 17 09:23:22 crc kubenswrapper[4935]: 
I1217 09:23:22.556392 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-6tqh7"] Dec 17 09:23:22 crc kubenswrapper[4935]: I1217 09:23:22.556684 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" podUID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerName="dnsmasq-dns" containerID="cri-o://7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90" gracePeriod=10 Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.006701 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.107305 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-operator-scripts\") pod \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.108225 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "98ee9cd1-7c28-49c0-89f6-d1f5e70ce023" (UID: "98ee9cd1-7c28-49c0-89f6-d1f5e70ce023"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.108459 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52cmk\" (UniqueName: \"kubernetes.io/projected/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-kube-api-access-52cmk\") pod \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\" (UID: \"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.109551 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.117215 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-kube-api-access-52cmk" (OuterVolumeSpecName: "kube-api-access-52cmk") pod "98ee9cd1-7c28-49c0-89f6-d1f5e70ce023" (UID: "98ee9cd1-7c28-49c0-89f6-d1f5e70ce023"). InnerVolumeSpecName "kube-api-access-52cmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.215612 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52cmk\" (UniqueName: \"kubernetes.io/projected/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023-kube-api-access-52cmk\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.250792 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.262179 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.288214 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.411136 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.422991 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ed76ae-907f-4693-a696-1d43ee6fb5e2-operator-scripts\") pod \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.423189 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485a1677-2580-4128-8711-74c1136c0716-operator-scripts\") pod \"485a1677-2580-4128-8711-74c1136c0716\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.423292 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75555da0-8b97-47fa-8851-3adb9fa308ec-operator-scripts\") pod \"75555da0-8b97-47fa-8851-3adb9fa308ec\" (UID: \"75555da0-8b97-47fa-8851-3adb9fa308ec\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.423320 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjbfc\" (UniqueName: \"kubernetes.io/projected/71ed76ae-907f-4693-a696-1d43ee6fb5e2-kube-api-access-mjbfc\") pod \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\" (UID: \"71ed76ae-907f-4693-a696-1d43ee6fb5e2\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.423385 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw47b\" (UniqueName: \"kubernetes.io/projected/485a1677-2580-4128-8711-74c1136c0716-kube-api-access-fw47b\") pod 
\"485a1677-2580-4128-8711-74c1136c0716\" (UID: \"485a1677-2580-4128-8711-74c1136c0716\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.423438 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g29jq\" (UniqueName: \"kubernetes.io/projected/75555da0-8b97-47fa-8851-3adb9fa308ec-kube-api-access-g29jq\") pod \"75555da0-8b97-47fa-8851-3adb9fa308ec\" (UID: \"75555da0-8b97-47fa-8851-3adb9fa308ec\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.424321 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/485a1677-2580-4128-8711-74c1136c0716-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "485a1677-2580-4128-8711-74c1136c0716" (UID: "485a1677-2580-4128-8711-74c1136c0716"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.424341 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71ed76ae-907f-4693-a696-1d43ee6fb5e2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "71ed76ae-907f-4693-a696-1d43ee6fb5e2" (UID: "71ed76ae-907f-4693-a696-1d43ee6fb5e2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.424705 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75555da0-8b97-47fa-8851-3adb9fa308ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "75555da0-8b97-47fa-8851-3adb9fa308ec" (UID: "75555da0-8b97-47fa-8851-3adb9fa308ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.431673 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ed76ae-907f-4693-a696-1d43ee6fb5e2-kube-api-access-mjbfc" (OuterVolumeSpecName: "kube-api-access-mjbfc") pod "71ed76ae-907f-4693-a696-1d43ee6fb5e2" (UID: "71ed76ae-907f-4693-a696-1d43ee6fb5e2"). InnerVolumeSpecName "kube-api-access-mjbfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.437286 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485a1677-2580-4128-8711-74c1136c0716-kube-api-access-fw47b" (OuterVolumeSpecName: "kube-api-access-fw47b") pod "485a1677-2580-4128-8711-74c1136c0716" (UID: "485a1677-2580-4128-8711-74c1136c0716"). InnerVolumeSpecName "kube-api-access-fw47b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.445793 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75555da0-8b97-47fa-8851-3adb9fa308ec-kube-api-access-g29jq" (OuterVolumeSpecName: "kube-api-access-g29jq") pod "75555da0-8b97-47fa-8851-3adb9fa308ec" (UID: "75555da0-8b97-47fa-8851-3adb9fa308ec"). InnerVolumeSpecName "kube-api-access-g29jq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.513388 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ggjn4" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.514923 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ggjn4" event={"ID":"485a1677-2580-4128-8711-74c1136c0716","Type":"ContainerDied","Data":"11f506b9c2edded2ae3209ef1f3c78b70ceaf1b41aa64eb501e2eabec6911956"} Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.515006 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11f506b9c2edded2ae3209ef1f3c78b70ceaf1b41aa64eb501e2eabec6911956" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.517473 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-7k42n" event={"ID":"98ee9cd1-7c28-49c0-89f6-d1f5e70ce023","Type":"ContainerDied","Data":"1de06bbe1ab5d010ac81e5d31c03f18babc8b4d0ebb1bd8ef71004ad45af3713"} Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.517513 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-7k42n" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.517769 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1de06bbe1ab5d010ac81e5d31c03f18babc8b4d0ebb1bd8ef71004ad45af3713" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.519858 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1418-account-create-update-lwlzp" event={"ID":"71ed76ae-907f-4693-a696-1d43ee6fb5e2","Type":"ContainerDied","Data":"074088530c85fa3b2c0c66124594893a9ab89e3cf1fc4d69295bb6aef439e583"} Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.519924 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="074088530c85fa3b2c0c66124594893a9ab89e3cf1fc4d69295bb6aef439e583" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.520036 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1418-account-create-update-lwlzp" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.523846 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-9dfe-account-create-update-wk5hj" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.523868 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-9dfe-account-create-update-wk5hj" event={"ID":"75555da0-8b97-47fa-8851-3adb9fa308ec","Type":"ContainerDied","Data":"7ab0804c676548497540c64681dd314fb8221ccd7b1a0a8381aeae1a26c6c20f"} Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.523938 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ab0804c676548497540c64681dd314fb8221ccd7b1a0a8381aeae1a26c6c20f" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.524630 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfgf5\" (UniqueName: \"kubernetes.io/projected/df393918-e2ab-4597-b1c6-acdc7034a6d3-kube-api-access-tfgf5\") pod \"df393918-e2ab-4597-b1c6-acdc7034a6d3\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.524782 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-dns-svc\") pod \"df393918-e2ab-4597-b1c6-acdc7034a6d3\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.524900 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-config\") pod \"df393918-e2ab-4597-b1c6-acdc7034a6d3\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.525032 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-ovsdbserver-sb\") pod \"df393918-e2ab-4597-b1c6-acdc7034a6d3\" (UID: \"df393918-e2ab-4597-b1c6-acdc7034a6d3\") " Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.525620 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/75555da0-8b97-47fa-8851-3adb9fa308ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.525640 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjbfc\" (UniqueName: \"kubernetes.io/projected/71ed76ae-907f-4693-a696-1d43ee6fb5e2-kube-api-access-mjbfc\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.525658 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw47b\" (UniqueName: \"kubernetes.io/projected/485a1677-2580-4128-8711-74c1136c0716-kube-api-access-fw47b\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.525670 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g29jq\" (UniqueName: \"kubernetes.io/projected/75555da0-8b97-47fa-8851-3adb9fa308ec-kube-api-access-g29jq\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.525682 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/71ed76ae-907f-4693-a696-1d43ee6fb5e2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.525695 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/485a1677-2580-4128-8711-74c1136c0716-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.530483 4935 
generic.go:334] "Generic (PLEG): container finished" podID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerID="7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90" exitCode=0 Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.530533 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" event={"ID":"df393918-e2ab-4597-b1c6-acdc7034a6d3","Type":"ContainerDied","Data":"7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90"} Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.530565 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" event={"ID":"df393918-e2ab-4597-b1c6-acdc7034a6d3","Type":"ContainerDied","Data":"e80195584d704e9e7b84c1b8eab35f446c63fadea1a9540c0f17def8e6b13808"} Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.530584 4935 scope.go:117] "RemoveContainer" containerID="7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.530712 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-6tqh7" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.530974 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df393918-e2ab-4597-b1c6-acdc7034a6d3-kube-api-access-tfgf5" (OuterVolumeSpecName: "kube-api-access-tfgf5") pod "df393918-e2ab-4597-b1c6-acdc7034a6d3" (UID: "df393918-e2ab-4597-b1c6-acdc7034a6d3"). InnerVolumeSpecName "kube-api-access-tfgf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.574548 4935 scope.go:117] "RemoveContainer" containerID="fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.589899 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-config" (OuterVolumeSpecName: "config") pod "df393918-e2ab-4597-b1c6-acdc7034a6d3" (UID: "df393918-e2ab-4597-b1c6-acdc7034a6d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.601388 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "df393918-e2ab-4597-b1c6-acdc7034a6d3" (UID: "df393918-e2ab-4597-b1c6-acdc7034a6d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.602327 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "df393918-e2ab-4597-b1c6-acdc7034a6d3" (UID: "df393918-e2ab-4597-b1c6-acdc7034a6d3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.603886 4935 scope.go:117] "RemoveContainer" containerID="7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90" Dec 17 09:23:23 crc kubenswrapper[4935]: E1217 09:23:23.605036 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90\": container with ID starting with 7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90 not found: ID does not exist" containerID="7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.605090 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90"} err="failed to get container status \"7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90\": rpc error: code = NotFound desc = could not find container \"7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90\": container with ID starting with 7ca8cbd74e7728a8bcf31e440f4e8f62aa94fd3f1e3fcc69b5eb0bd5fb581b90 not found: ID does not exist" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.605128 4935 scope.go:117] "RemoveContainer" containerID="fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467" Dec 17 09:23:23 crc kubenswrapper[4935]: E1217 09:23:23.605702 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467\": container with ID starting with fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467 not found: ID does not exist" containerID="fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.605748 
4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467"} err="failed to get container status \"fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467\": rpc error: code = NotFound desc = could not find container \"fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467\": container with ID starting with fdd28c8ddd8cb108719f7be238f0217bf1c656e7f286c66fa065675742d18467 not found: ID does not exist" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.630932 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfgf5\" (UniqueName: \"kubernetes.io/projected/df393918-e2ab-4597-b1c6-acdc7034a6d3-kube-api-access-tfgf5\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.631648 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.631658 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.631670 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/df393918-e2ab-4597-b1c6-acdc7034a6d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.883130 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-6tqh7"] Dec 17 09:23:23 crc kubenswrapper[4935]: I1217 09:23:23.891077 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-6tqh7"] Dec 17 09:23:25 crc kubenswrapper[4935]: I1217 09:23:25.138047 4935 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df393918-e2ab-4597-b1c6-acdc7034a6d3" path="/var/lib/kubelet/pods/df393918-e2ab-4597-b1c6-acdc7034a6d3/volumes" Dec 17 09:23:25 crc kubenswrapper[4935]: I1217 09:23:25.555568 4935 generic.go:334] "Generic (PLEG): container finished" podID="90598781-7630-4807-9735-eb1cfaba2927" containerID="60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c" exitCode=0 Dec 17 09:23:25 crc kubenswrapper[4935]: I1217 09:23:25.555651 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90598781-7630-4807-9735-eb1cfaba2927","Type":"ContainerDied","Data":"60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c"} Dec 17 09:23:25 crc kubenswrapper[4935]: I1217 09:23:25.558367 4935 generic.go:334] "Generic (PLEG): container finished" podID="7c863e0e-b041-4e68-852f-addc7126a215" containerID="ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515" exitCode=0 Dec 17 09:23:25 crc kubenswrapper[4935]: I1217 09:23:25.558402 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c863e0e-b041-4e68-852f-addc7126a215","Type":"ContainerDied","Data":"ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515"} Dec 17 09:23:26 crc kubenswrapper[4935]: I1217 09:23:26.568756 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c863e0e-b041-4e68-852f-addc7126a215","Type":"ContainerStarted","Data":"ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2"} Dec 17 09:23:26 crc kubenswrapper[4935]: I1217 09:23:26.569530 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 17 09:23:26 crc kubenswrapper[4935]: I1217 09:23:26.573803 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"90598781-7630-4807-9735-eb1cfaba2927","Type":"ContainerStarted","Data":"83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8"} Dec 17 09:23:26 crc kubenswrapper[4935]: I1217 09:23:26.574095 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:23:26 crc kubenswrapper[4935]: I1217 09:23:26.598343 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.929052868 podStartE2EDuration="1m1.598312987s" podCreationTimestamp="2025-12-17 09:22:25 +0000 UTC" firstStartedPulling="2025-12-17 09:22:27.754928782 +0000 UTC m=+1067.414769545" lastFinishedPulling="2025-12-17 09:22:51.424188901 +0000 UTC m=+1091.084029664" observedRunningTime="2025-12-17 09:23:26.591132042 +0000 UTC m=+1126.250972805" watchObservedRunningTime="2025-12-17 09:23:26.598312987 +0000 UTC m=+1126.258153750" Dec 17 09:23:26 crc kubenswrapper[4935]: I1217 09:23:26.625401 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.434563545 podStartE2EDuration="1m0.625378928s" podCreationTimestamp="2025-12-17 09:22:26 +0000 UTC" firstStartedPulling="2025-12-17 09:22:28.248744293 +0000 UTC m=+1067.908585056" lastFinishedPulling="2025-12-17 09:22:51.439559676 +0000 UTC m=+1091.099400439" observedRunningTime="2025-12-17 09:23:26.617758002 +0000 UTC m=+1126.277598765" watchObservedRunningTime="2025-12-17 09:23:26.625378928 +0000 UTC m=+1126.285219691" Dec 17 09:23:27 crc kubenswrapper[4935]: I1217 09:23:27.213390 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 17 09:23:27 crc kubenswrapper[4935]: I1217 09:23:27.582820 4935 generic.go:334] "Generic (PLEG): container finished" podID="499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" containerID="d3b0283abe723315e508dafa0a4ee1f7aa5ce6cf37532f4e584fd41f6db44619" exitCode=0 Dec 17 
09:23:27 crc kubenswrapper[4935]: I1217 09:23:27.583923 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jdzl5" event={"ID":"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7","Type":"ContainerDied","Data":"d3b0283abe723315e508dafa0a4ee1f7aa5ce6cf37532f4e584fd41f6db44619"} Dec 17 09:23:29 crc kubenswrapper[4935]: I1217 09:23:29.247539 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0" Dec 17 09:23:29 crc kubenswrapper[4935]: I1217 09:23:29.257060 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a2c97a57-ac68-4fab-acbd-ecdec8db5fb5-etc-swift\") pod \"swift-storage-0\" (UID: \"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5\") " pod="openstack/swift-storage-0" Dec 17 09:23:29 crc kubenswrapper[4935]: I1217 09:23:29.512382 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 17 09:23:30 crc kubenswrapper[4935]: I1217 09:23:30.130791 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:23:30 crc kubenswrapper[4935]: I1217 09:23:30.130861 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:23:30 crc kubenswrapper[4935]: I1217 09:23:30.311789 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-l4zph" podUID="c1e01a1b-9baa-4738-8e44-b206863b4d3d" containerName="ovn-controller" probeResult="failure" output=< Dec 17 09:23:30 crc kubenswrapper[4935]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 17 09:23:30 crc kubenswrapper[4935]: > Dec 17 09:23:30 crc kubenswrapper[4935]: I1217 09:23:30.322327 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.232135 4935 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podf25c22f0-aa41-47d9-ab2b-e3cdc9394f53"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podf25c22f0-aa41-47d9-ab2b-e3cdc9394f53] : Timed out while waiting for systemd to remove kubepods-besteffort-podf25c22f0_aa41_47d9_ab2b_e3cdc9394f53.slice" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.627230 4935 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-l4zph" podUID="c1e01a1b-9baa-4738-8e44-b206863b4d3d" containerName="ovn-controller" probeResult="failure" output=< Dec 17 09:23:35 crc kubenswrapper[4935]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 17 09:23:35 crc kubenswrapper[4935]: > Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.629243 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jhrvs" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.882572 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-l4zph-config-w9tq7"] Dec 17 09:23:35 crc kubenswrapper[4935]: E1217 09:23:35.882984 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ed76ae-907f-4693-a696-1d43ee6fb5e2" containerName="mariadb-account-create-update" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.883002 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ed76ae-907f-4693-a696-1d43ee6fb5e2" containerName="mariadb-account-create-update" Dec 17 09:23:35 crc kubenswrapper[4935]: E1217 09:23:35.889353 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485a1677-2580-4128-8711-74c1136c0716" containerName="mariadb-database-create" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889390 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="485a1677-2580-4128-8711-74c1136c0716" containerName="mariadb-database-create" Dec 17 09:23:35 crc kubenswrapper[4935]: E1217 09:23:35.889426 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerName="init" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889433 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerName="init" Dec 17 09:23:35 crc kubenswrapper[4935]: E1217 09:23:35.889442 4935 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerName="dnsmasq-dns" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889448 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerName="dnsmasq-dns" Dec 17 09:23:35 crc kubenswrapper[4935]: E1217 09:23:35.889463 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98ee9cd1-7c28-49c0-89f6-d1f5e70ce023" containerName="mariadb-database-create" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889474 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="98ee9cd1-7c28-49c0-89f6-d1f5e70ce023" containerName="mariadb-database-create" Dec 17 09:23:35 crc kubenswrapper[4935]: E1217 09:23:35.889496 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75555da0-8b97-47fa-8851-3adb9fa308ec" containerName="mariadb-account-create-update" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889501 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="75555da0-8b97-47fa-8851-3adb9fa308ec" containerName="mariadb-account-create-update" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889800 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="485a1677-2580-4128-8711-74c1136c0716" containerName="mariadb-database-create" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889828 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ed76ae-907f-4693-a696-1d43ee6fb5e2" containerName="mariadb-account-create-update" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889837 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="df393918-e2ab-4597-b1c6-acdc7034a6d3" containerName="dnsmasq-dns" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.889847 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="98ee9cd1-7c28-49c0-89f6-d1f5e70ce023" containerName="mariadb-database-create" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 
09:23:35.889854 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="75555da0-8b97-47fa-8851-3adb9fa308ec" containerName="mariadb-account-create-update" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.890497 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.893008 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.905103 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l4zph-config-w9tq7"] Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.979596 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-scripts\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.979651 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92jwf\" (UniqueName: \"kubernetes.io/projected/1ec63236-123e-444d-b79c-4b0bb2dba193-kube-api-access-92jwf\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.979676 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 
09:23:35.979716 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-additional-scripts\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.979783 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run-ovn\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:35 crc kubenswrapper[4935]: I1217 09:23:35.979816 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-log-ovn\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.081528 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run-ovn\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.081589 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-log-ovn\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 
09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.081653 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-scripts\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.081679 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jwf\" (UniqueName: \"kubernetes.io/projected/1ec63236-123e-444d-b79c-4b0bb2dba193-kube-api-access-92jwf\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.081703 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.081746 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-additional-scripts\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.081993 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-log-ovn\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc 
kubenswrapper[4935]: I1217 09:23:36.082014 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.082244 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run-ovn\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.082651 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-additional-scripts\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.092657 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-scripts\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.141136 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jwf\" (UniqueName: \"kubernetes.io/projected/1ec63236-123e-444d-b79c-4b0bb2dba193-kube-api-access-92jwf\") pod \"ovn-controller-l4zph-config-w9tq7\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:36 crc kubenswrapper[4935]: I1217 09:23:36.218819 4935 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.039219 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.176112 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.204386 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-combined-ca-bundle\") pod \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.204452 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-swiftconf\") pod \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.204652 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-scripts\") pod \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.204677 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk8p9\" (UniqueName: \"kubernetes.io/projected/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-kube-api-access-hk8p9\") pod \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.204729 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-ring-data-devices\") pod \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.204767 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-dispersionconf\") pod \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.204831 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-etc-swift\") pod \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\" (UID: \"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7\") " Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.231024 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" (UID: "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.231918 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" (UID: "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.256896 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-kube-api-access-hk8p9" (OuterVolumeSpecName: "kube-api-access-hk8p9") pod "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" (UID: "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7"). InnerVolumeSpecName "kube-api-access-hk8p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.275710 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" (UID: "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.276727 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" (UID: "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.281453 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" (UID: "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.304657 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-scripts" (OuterVolumeSpecName: "scripts") pod "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" (UID: "499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.309074 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.309147 4935 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.309159 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.309167 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk8p9\" (UniqueName: \"kubernetes.io/projected/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-kube-api-access-hk8p9\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.309179 4935 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.309186 4935 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.309194 4935 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.533515 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.615400 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-z2qrd"] Dec 17 09:23:37 crc kubenswrapper[4935]: E1217 09:23:37.615851 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" containerName="swift-ring-rebalance" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.615874 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" containerName="swift-ring-rebalance" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.616049 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7" containerName="swift-ring-rebalance" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.626974 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.644584 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-l4zph-config-w9tq7"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.655621 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z2qrd"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.705519 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-4rx9f"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.706669 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.718398 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8v78\" (UniqueName: \"kubernetes.io/projected/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-kube-api-access-g8v78\") pod \"cinder-db-create-z2qrd\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.718481 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-operator-scripts\") pod \"cinder-db-create-z2qrd\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.718570 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4rx9f"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.755726 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-jdzl5" 
event={"ID":"499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7","Type":"ContainerDied","Data":"d16f7fdc2205c93e1649ad0288c7508ec3323992797156298d479f4174da7602"} Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.755789 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d16f7fdc2205c93e1649ad0288c7508ec3323992797156298d479f4174da7602" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.755916 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-jdzl5" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.800046 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l4zph-config-w9tq7" event={"ID":"1ec63236-123e-444d-b79c-4b0bb2dba193","Type":"ContainerStarted","Data":"9ed0c749785657a69b8e5db53f614aee97166039f4622907266e0a030be383e7"} Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.819229 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-fcac-account-create-update-xwfk2"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.821573 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cb24d5e-88f0-4613-9cad-16f2f3717cae-operator-scripts\") pod \"barbican-db-create-4rx9f\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.821653 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbpfz\" (UniqueName: \"kubernetes.io/projected/9cb24d5e-88f0-4613-9cad-16f2f3717cae-kube-api-access-lbpfz\") pod \"barbican-db-create-4rx9f\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.821714 4935 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-g8v78\" (UniqueName: \"kubernetes.io/projected/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-kube-api-access-g8v78\") pod \"cinder-db-create-z2qrd\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.821777 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-operator-scripts\") pod \"cinder-db-create-z2qrd\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.822884 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-operator-scripts\") pod \"cinder-db-create-z2qrd\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.824227 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.831992 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fcac-account-create-update-xwfk2"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.832531 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.888114 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8v78\" (UniqueName: \"kubernetes.io/projected/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-kube-api-access-g8v78\") pod \"cinder-db-create-z2qrd\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.910350 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-a108-account-create-update-5gp8l"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.911800 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.916478 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.924353 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cb24d5e-88f0-4613-9cad-16f2f3717cae-operator-scripts\") pod \"barbican-db-create-4rx9f\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.924500 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-operator-scripts\") pod \"cinder-fcac-account-create-update-xwfk2\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.924549 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvdzk\" (UniqueName: \"kubernetes.io/projected/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-kube-api-access-qvdzk\") pod \"cinder-fcac-account-create-update-xwfk2\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.924570 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbpfz\" (UniqueName: \"kubernetes.io/projected/9cb24d5e-88f0-4613-9cad-16f2f3717cae-kube-api-access-lbpfz\") pod \"barbican-db-create-4rx9f\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.924683 4935 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack"/"barbican-db-secret" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.924852 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a108-account-create-update-5gp8l"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.925859 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cb24d5e-88f0-4613-9cad-16f2f3717cae-operator-scripts\") pod \"barbican-db-create-4rx9f\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.954947 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.960139 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbpfz\" (UniqueName: \"kubernetes.io/projected/9cb24d5e-88f0-4613-9cad-16f2f3717cae-kube-api-access-lbpfz\") pod \"barbican-db-create-4rx9f\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.997905 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-kxwws"] Dec 17 09:23:37 crc kubenswrapper[4935]: I1217 09:23:37.999097 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.016806 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-kxwws"] Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.026378 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-operator-scripts\") pod \"cinder-fcac-account-create-update-xwfk2\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.026464 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvdzk\" (UniqueName: \"kubernetes.io/projected/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-kube-api-access-qvdzk\") pod \"cinder-fcac-account-create-update-xwfk2\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.026556 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sffj8\" (UniqueName: \"kubernetes.io/projected/cd84fa06-5321-450f-b602-7f09d571a6d6-kube-api-access-sffj8\") pod \"neutron-db-create-kxwws\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.026608 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6df98a0-3831-48eb-8a28-648be6ec3b08-operator-scripts\") pod \"barbican-a108-account-create-update-5gp8l\" (UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.026652 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd84fa06-5321-450f-b602-7f09d571a6d6-operator-scripts\") pod \"neutron-db-create-kxwws\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.026702 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsxp5\" (UniqueName: \"kubernetes.io/projected/f6df98a0-3831-48eb-8a28-648be6ec3b08-kube-api-access-qsxp5\") pod \"barbican-a108-account-create-update-5gp8l\" (UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.035878 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-operator-scripts\") pod \"cinder-fcac-account-create-update-xwfk2\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.054940 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.064038 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvdzk\" (UniqueName: \"kubernetes.io/projected/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-kube-api-access-qvdzk\") pod \"cinder-fcac-account-create-update-xwfk2\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.144786 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sffj8\" (UniqueName: \"kubernetes.io/projected/cd84fa06-5321-450f-b602-7f09d571a6d6-kube-api-access-sffj8\") pod \"neutron-db-create-kxwws\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.144847 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6df98a0-3831-48eb-8a28-648be6ec3b08-operator-scripts\") pod \"barbican-a108-account-create-update-5gp8l\" (UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.144878 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd84fa06-5321-450f-b602-7f09d571a6d6-operator-scripts\") pod \"neutron-db-create-kxwws\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.144907 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsxp5\" (UniqueName: \"kubernetes.io/projected/f6df98a0-3831-48eb-8a28-648be6ec3b08-kube-api-access-qsxp5\") pod \"barbican-a108-account-create-update-5gp8l\" 
(UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.145342 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-t4kgb"] Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.146089 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6df98a0-3831-48eb-8a28-648be6ec3b08-operator-scripts\") pod \"barbican-a108-account-create-update-5gp8l\" (UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.146671 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd84fa06-5321-450f-b602-7f09d571a6d6-operator-scripts\") pod \"neutron-db-create-kxwws\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.155026 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.155218 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.162317 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.162536 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.162563 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bnktl" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.162719 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.169177 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-t4kgb"] Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.207389 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sffj8\" (UniqueName: \"kubernetes.io/projected/cd84fa06-5321-450f-b602-7f09d571a6d6-kube-api-access-sffj8\") pod \"neutron-db-create-kxwws\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.242823 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsxp5\" (UniqueName: \"kubernetes.io/projected/f6df98a0-3831-48eb-8a28-648be6ec3b08-kube-api-access-qsxp5\") pod \"barbican-a108-account-create-update-5gp8l\" (UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.248317 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-combined-ca-bundle\") pod 
\"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.248612 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rks6\" (UniqueName: \"kubernetes.io/projected/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-kube-api-access-6rks6\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.248831 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-config-data\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.285046 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6ff9-account-create-update-cqjmd"] Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.296498 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.303629 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.335778 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ff9-account-create-update-cqjmd"] Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.353986 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-combined-ca-bundle\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.354035 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rks6\" (UniqueName: \"kubernetes.io/projected/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-kube-api-access-6rks6\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.354081 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cx5m\" (UniqueName: \"kubernetes.io/projected/c5c13b61-d034-4dfe-9632-f11b4210eba1-kube-api-access-4cx5m\") pod \"neutron-6ff9-account-create-update-cqjmd\" (UID: \"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.354108 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-config-data\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " 
pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.354179 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c13b61-d034-4dfe-9632-f11b4210eba1-operator-scripts\") pod \"neutron-6ff9-account-create-update-cqjmd\" (UID: \"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.371544 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.416594 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rks6\" (UniqueName: \"kubernetes.io/projected/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-kube-api-access-6rks6\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.461706 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-config-data\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.462338 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-combined-ca-bundle\") pod \"keystone-db-sync-t4kgb\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.471152 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cx5m\" (UniqueName: 
\"kubernetes.io/projected/c5c13b61-d034-4dfe-9632-f11b4210eba1-kube-api-access-4cx5m\") pod \"neutron-6ff9-account-create-update-cqjmd\" (UID: \"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.471801 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c13b61-d034-4dfe-9632-f11b4210eba1-operator-scripts\") pod \"neutron-6ff9-account-create-update-cqjmd\" (UID: \"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.473002 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c13b61-d034-4dfe-9632-f11b4210eba1-operator-scripts\") pod \"neutron-6ff9-account-create-update-cqjmd\" (UID: \"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.528614 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cx5m\" (UniqueName: \"kubernetes.io/projected/c5c13b61-d034-4dfe-9632-f11b4210eba1-kube-api-access-4cx5m\") pod \"neutron-6ff9-account-create-update-cqjmd\" (UID: \"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.543384 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.671374 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.786683 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.844516 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"db822240c7702c5814440d552724987dc75a9093ab1681ff2dabc95d41400124"} Dec 17 09:23:38 crc kubenswrapper[4935]: I1217 09:23:38.909648 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z2qrd"] Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.307126 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-fcac-account-create-update-xwfk2"] Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.495801 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-kxwws"] Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.597995 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-4rx9f"] Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.626515 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-a108-account-create-update-5gp8l"] Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.633996 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6ff9-account-create-update-cqjmd"] Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.753356 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-t4kgb"] Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.853919 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4kgb" event={"ID":"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41","Type":"ContainerStarted","Data":"192e2adf03de52ef30178394e279f37ad75ce6f027a87ba84b9aaf181391deed"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.857066 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-bmvch" event={"ID":"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07","Type":"ContainerStarted","Data":"fef2d06d27c63d666220b1ee7bf5e444341ac9e38fdb62d31d0790e181779c06"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.867110 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kxwws" event={"ID":"cd84fa06-5321-450f-b602-7f09d571a6d6","Type":"ContainerStarted","Data":"1076f5831d0a3aff2b509cfde50d315cc968217dc951ad7f7607fee47df033d7"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.870260 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4rx9f" event={"ID":"9cb24d5e-88f0-4613-9cad-16f2f3717cae","Type":"ContainerStarted","Data":"a847e3849289fa1748c0f5b04536a5655559254b5861be27b8ae94451e14254c"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.879832 4935 generic.go:334] "Generic (PLEG): container finished" podID="454cfd39-9bc9-4e7e-968b-7f4b67654fb0" containerID="992f1cb705bceb8a16a4ad9543ea505a37b782e1c7c3919c4f0c33ee631cfcd4" exitCode=0 Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.879916 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z2qrd" event={"ID":"454cfd39-9bc9-4e7e-968b-7f4b67654fb0","Type":"ContainerDied","Data":"992f1cb705bceb8a16a4ad9543ea505a37b782e1c7c3919c4f0c33ee631cfcd4"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.879951 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z2qrd" event={"ID":"454cfd39-9bc9-4e7e-968b-7f4b67654fb0","Type":"ContainerStarted","Data":"5baf8fa9a5fb1ee44418f675e2ecdb4a1298443cc08024175a48d44f6817ba3c"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.880743 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-bmvch" podStartSLOduration=4.369890106 podStartE2EDuration="19.880721144s" podCreationTimestamp="2025-12-17 09:23:20 +0000 UTC" 
firstStartedPulling="2025-12-17 09:23:21.518652184 +0000 UTC m=+1121.178492947" lastFinishedPulling="2025-12-17 09:23:37.029483222 +0000 UTC m=+1136.689323985" observedRunningTime="2025-12-17 09:23:39.877460684 +0000 UTC m=+1139.537301447" watchObservedRunningTime="2025-12-17 09:23:39.880721144 +0000 UTC m=+1139.540561907" Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.890892 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a108-account-create-update-5gp8l" event={"ID":"f6df98a0-3831-48eb-8a28-648be6ec3b08","Type":"ContainerStarted","Data":"59b7ceb83777586d5c4fe9b22b1ca4491ecbc27afd5983626f6f956999c72c16"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.901095 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fcac-account-create-update-xwfk2" event={"ID":"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1","Type":"ContainerStarted","Data":"db1f7c1e0b5bcee64b60dccb81d4b73c80fe92ee9a37075949034bb6d429ce1b"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.901159 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fcac-account-create-update-xwfk2" event={"ID":"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1","Type":"ContainerStarted","Data":"9a8651e3575743e27bb361af00933e8e226ff1def38c277f956802bc5b9a6792"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.909318 4935 generic.go:334] "Generic (PLEG): container finished" podID="1ec63236-123e-444d-b79c-4b0bb2dba193" containerID="369e0278087dc0bd0f384b1fa90f4ab44f3c28c60b52cbac9c4d94759ed8c3c1" exitCode=0 Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.909465 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l4zph-config-w9tq7" event={"ID":"1ec63236-123e-444d-b79c-4b0bb2dba193","Type":"ContainerDied","Data":"369e0278087dc0bd0f384b1fa90f4ab44f3c28c60b52cbac9c4d94759ed8c3c1"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.917418 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-6ff9-account-create-update-cqjmd" event={"ID":"c5c13b61-d034-4dfe-9632-f11b4210eba1","Type":"ContainerStarted","Data":"ea8bdf815539a93e9b33fe871022a3b74e29b86c987c79cd420ad88bcafcf459"} Dec 17 09:23:39 crc kubenswrapper[4935]: I1217 09:23:39.931388 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-fcac-account-create-update-xwfk2" podStartSLOduration=2.93136809 podStartE2EDuration="2.93136809s" podCreationTimestamp="2025-12-17 09:23:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:23:39.930300825 +0000 UTC m=+1139.590141588" watchObservedRunningTime="2025-12-17 09:23:39.93136809 +0000 UTC m=+1139.591208853" Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.292129 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-l4zph" Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.928670 4935 generic.go:334] "Generic (PLEG): container finished" podID="cd84fa06-5321-450f-b602-7f09d571a6d6" containerID="4a349b57feb6c668a7410d3d6c44992ef03228947d63e42323ca94794ebf3dbf" exitCode=0 Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.928872 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kxwws" event={"ID":"cd84fa06-5321-450f-b602-7f09d571a6d6","Type":"ContainerDied","Data":"4a349b57feb6c668a7410d3d6c44992ef03228947d63e42323ca94794ebf3dbf"} Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.932473 4935 generic.go:334] "Generic (PLEG): container finished" podID="c5c13b61-d034-4dfe-9632-f11b4210eba1" containerID="1e8eddfda05e83a1c6823a74e4e06b33e6d2ee020da47df9a5df12ef153c46e8" exitCode=0 Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.932604 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff9-account-create-update-cqjmd" 
event={"ID":"c5c13b61-d034-4dfe-9632-f11b4210eba1","Type":"ContainerDied","Data":"1e8eddfda05e83a1c6823a74e4e06b33e6d2ee020da47df9a5df12ef153c46e8"} Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.934542 4935 generic.go:334] "Generic (PLEG): container finished" podID="9cb24d5e-88f0-4613-9cad-16f2f3717cae" containerID="89b2b722e268666b221053c8920c32f79cc1f3d4cf582f40887173888baf4949" exitCode=0 Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.934633 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4rx9f" event={"ID":"9cb24d5e-88f0-4613-9cad-16f2f3717cae","Type":"ContainerDied","Data":"89b2b722e268666b221053c8920c32f79cc1f3d4cf582f40887173888baf4949"} Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.941581 4935 generic.go:334] "Generic (PLEG): container finished" podID="f6df98a0-3831-48eb-8a28-648be6ec3b08" containerID="5d51f5eb1cd3f3515acbdb9ece5eb9c266c47c43dcda7661a8f14162908c2979" exitCode=0 Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.941803 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a108-account-create-update-5gp8l" event={"ID":"f6df98a0-3831-48eb-8a28-648be6ec3b08","Type":"ContainerDied","Data":"5d51f5eb1cd3f3515acbdb9ece5eb9c266c47c43dcda7661a8f14162908c2979"} Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.952040 4935 generic.go:334] "Generic (PLEG): container finished" podID="7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1" containerID="db1f7c1e0b5bcee64b60dccb81d4b73c80fe92ee9a37075949034bb6d429ce1b" exitCode=0 Dec 17 09:23:40 crc kubenswrapper[4935]: I1217 09:23:40.952162 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fcac-account-create-update-xwfk2" event={"ID":"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1","Type":"ContainerDied","Data":"db1f7c1e0b5bcee64b60dccb81d4b73c80fe92ee9a37075949034bb6d429ce1b"} Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.455521 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.568130 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-operator-scripts\") pod \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.568232 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8v78\" (UniqueName: \"kubernetes.io/projected/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-kube-api-access-g8v78\") pod \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\" (UID: \"454cfd39-9bc9-4e7e-968b-7f4b67654fb0\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.574018 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "454cfd39-9bc9-4e7e-968b-7f4b67654fb0" (UID: "454cfd39-9bc9-4e7e-968b-7f4b67654fb0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.583497 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-kube-api-access-g8v78" (OuterVolumeSpecName: "kube-api-access-g8v78") pod "454cfd39-9bc9-4e7e-968b-7f4b67654fb0" (UID: "454cfd39-9bc9-4e7e-968b-7f4b67654fb0"). InnerVolumeSpecName "kube-api-access-g8v78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.670197 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.670231 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8v78\" (UniqueName: \"kubernetes.io/projected/454cfd39-9bc9-4e7e-968b-7f4b67654fb0-kube-api-access-g8v78\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.681055 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772190 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92jwf\" (UniqueName: \"kubernetes.io/projected/1ec63236-123e-444d-b79c-4b0bb2dba193-kube-api-access-92jwf\") pod \"1ec63236-123e-444d-b79c-4b0bb2dba193\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772264 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run\") pod \"1ec63236-123e-444d-b79c-4b0bb2dba193\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772366 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run" (OuterVolumeSpecName: "var-run") pod "1ec63236-123e-444d-b79c-4b0bb2dba193" (UID: "1ec63236-123e-444d-b79c-4b0bb2dba193"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772420 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run-ovn\") pod \"1ec63236-123e-444d-b79c-4b0bb2dba193\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772455 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1ec63236-123e-444d-b79c-4b0bb2dba193" (UID: "1ec63236-123e-444d-b79c-4b0bb2dba193"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772510 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-log-ovn\") pod \"1ec63236-123e-444d-b79c-4b0bb2dba193\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772588 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1ec63236-123e-444d-b79c-4b0bb2dba193" (UID: "1ec63236-123e-444d-b79c-4b0bb2dba193"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772682 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-additional-scripts\") pod \"1ec63236-123e-444d-b79c-4b0bb2dba193\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.772839 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-scripts\") pod \"1ec63236-123e-444d-b79c-4b0bb2dba193\" (UID: \"1ec63236-123e-444d-b79c-4b0bb2dba193\") " Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.774144 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-scripts" (OuterVolumeSpecName: "scripts") pod "1ec63236-123e-444d-b79c-4b0bb2dba193" (UID: "1ec63236-123e-444d-b79c-4b0bb2dba193"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.776242 4935 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.776328 4935 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.776344 4935 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1ec63236-123e-444d-b79c-4b0bb2dba193-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.776357 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.777504 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1ec63236-123e-444d-b79c-4b0bb2dba193" (UID: "1ec63236-123e-444d-b79c-4b0bb2dba193"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.782750 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ec63236-123e-444d-b79c-4b0bb2dba193-kube-api-access-92jwf" (OuterVolumeSpecName: "kube-api-access-92jwf") pod "1ec63236-123e-444d-b79c-4b0bb2dba193" (UID: "1ec63236-123e-444d-b79c-4b0bb2dba193"). InnerVolumeSpecName "kube-api-access-92jwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.878199 4935 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1ec63236-123e-444d-b79c-4b0bb2dba193-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.878565 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92jwf\" (UniqueName: \"kubernetes.io/projected/1ec63236-123e-444d-b79c-4b0bb2dba193-kube-api-access-92jwf\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.962404 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"b0431a7b5653af64562dff6ec08ba48c88059d302724a69605f29e8d59b6dc79"} Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.962468 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"d7c32d9d25a6c6d982288fb3c95afbddc58338732f05a8a7d8311fd9c49103ea"} Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.964951 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z2qrd" event={"ID":"454cfd39-9bc9-4e7e-968b-7f4b67654fb0","Type":"ContainerDied","Data":"5baf8fa9a5fb1ee44418f675e2ecdb4a1298443cc08024175a48d44f6817ba3c"} Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.965013 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z2qrd" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.968195 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5baf8fa9a5fb1ee44418f675e2ecdb4a1298443cc08024175a48d44f6817ba3c" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.973216 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-l4zph-config-w9tq7" Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.973256 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-l4zph-config-w9tq7" event={"ID":"1ec63236-123e-444d-b79c-4b0bb2dba193","Type":"ContainerDied","Data":"9ed0c749785657a69b8e5db53f614aee97166039f4622907266e0a030be383e7"} Dec 17 09:23:41 crc kubenswrapper[4935]: I1217 09:23:41.973311 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ed0c749785657a69b8e5db53f614aee97166039f4622907266e0a030be383e7" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.295505 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.396185 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd84fa06-5321-450f-b602-7f09d571a6d6-operator-scripts\") pod \"cd84fa06-5321-450f-b602-7f09d571a6d6\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.396625 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sffj8\" (UniqueName: \"kubernetes.io/projected/cd84fa06-5321-450f-b602-7f09d571a6d6-kube-api-access-sffj8\") pod \"cd84fa06-5321-450f-b602-7f09d571a6d6\" (UID: \"cd84fa06-5321-450f-b602-7f09d571a6d6\") " Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.396811 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd84fa06-5321-450f-b602-7f09d571a6d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd84fa06-5321-450f-b602-7f09d571a6d6" (UID: "cd84fa06-5321-450f-b602-7f09d571a6d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.397725 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd84fa06-5321-450f-b602-7f09d571a6d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.414006 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd84fa06-5321-450f-b602-7f09d571a6d6-kube-api-access-sffj8" (OuterVolumeSpecName: "kube-api-access-sffj8") pod "cd84fa06-5321-450f-b602-7f09d571a6d6" (UID: "cd84fa06-5321-450f-b602-7f09d571a6d6"). InnerVolumeSpecName "kube-api-access-sffj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.499596 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sffj8\" (UniqueName: \"kubernetes.io/projected/cd84fa06-5321-450f-b602-7f09d571a6d6-kube-api-access-sffj8\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.824114 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-l4zph-config-w9tq7"] Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.833091 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-l4zph-config-w9tq7"] Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.986612 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-4rx9f" event={"ID":"9cb24d5e-88f0-4613-9cad-16f2f3717cae","Type":"ContainerDied","Data":"a847e3849289fa1748c0f5b04536a5655559254b5861be27b8ae94451e14254c"} Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.986682 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a847e3849289fa1748c0f5b04536a5655559254b5861be27b8ae94451e14254c" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.988022 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-a108-account-create-update-5gp8l" event={"ID":"f6df98a0-3831-48eb-8a28-648be6ec3b08","Type":"ContainerDied","Data":"59b7ceb83777586d5c4fe9b22b1ca4491ecbc27afd5983626f6f956999c72c16"} Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.988044 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59b7ceb83777586d5c4fe9b22b1ca4491ecbc27afd5983626f6f956999c72c16" Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.999389 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-fcac-account-create-update-xwfk2" 
event={"ID":"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1","Type":"ContainerDied","Data":"9a8651e3575743e27bb361af00933e8e226ff1def38c277f956802bc5b9a6792"} Dec 17 09:23:42 crc kubenswrapper[4935]: I1217 09:23:42.999451 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a8651e3575743e27bb361af00933e8e226ff1def38c277f956802bc5b9a6792" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.005341 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"785724f5f63364516999cf6752a3956987ccc6876c146dc8b12796fa6d4d0b24"} Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.005410 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"e59304056bfe68688f3e6c32a18ba0fc8e8117ef0b4ed8d558e2ba7678d2cce1"} Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.011346 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-kxwws" event={"ID":"cd84fa06-5321-450f-b602-7f09d571a6d6","Type":"ContainerDied","Data":"1076f5831d0a3aff2b509cfde50d315cc968217dc951ad7f7607fee47df033d7"} Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.011416 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1076f5831d0a3aff2b509cfde50d315cc968217dc951ad7f7607fee47df033d7" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.011565 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-kxwws" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.016469 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6ff9-account-create-update-cqjmd" event={"ID":"c5c13b61-d034-4dfe-9632-f11b4210eba1","Type":"ContainerDied","Data":"ea8bdf815539a93e9b33fe871022a3b74e29b86c987c79cd420ad88bcafcf459"} Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.016540 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea8bdf815539a93e9b33fe871022a3b74e29b86c987c79cd420ad88bcafcf459" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.041457 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.048014 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.054747 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.081261 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.117813 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbpfz\" (UniqueName: \"kubernetes.io/projected/9cb24d5e-88f0-4613-9cad-16f2f3717cae-kube-api-access-lbpfz\") pod \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.117889 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cb24d5e-88f0-4613-9cad-16f2f3717cae-operator-scripts\") pod \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\" (UID: \"9cb24d5e-88f0-4613-9cad-16f2f3717cae\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.117922 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-operator-scripts\") pod \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.118025 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvdzk\" (UniqueName: \"kubernetes.io/projected/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-kube-api-access-qvdzk\") pod \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\" (UID: \"7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.118069 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsxp5\" (UniqueName: \"kubernetes.io/projected/f6df98a0-3831-48eb-8a28-648be6ec3b08-kube-api-access-qsxp5\") pod \"f6df98a0-3831-48eb-8a28-648be6ec3b08\" (UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.118149 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6df98a0-3831-48eb-8a28-648be6ec3b08-operator-scripts\") pod \"f6df98a0-3831-48eb-8a28-648be6ec3b08\" (UID: \"f6df98a0-3831-48eb-8a28-648be6ec3b08\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.120071 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1" (UID: "7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.120791 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6df98a0-3831-48eb-8a28-648be6ec3b08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f6df98a0-3831-48eb-8a28-648be6ec3b08" (UID: "f6df98a0-3831-48eb-8a28-648be6ec3b08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.124250 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cb24d5e-88f0-4613-9cad-16f2f3717cae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cb24d5e-88f0-4613-9cad-16f2f3717cae" (UID: "9cb24d5e-88f0-4613-9cad-16f2f3717cae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.125844 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb24d5e-88f0-4613-9cad-16f2f3717cae-kube-api-access-lbpfz" (OuterVolumeSpecName: "kube-api-access-lbpfz") pod "9cb24d5e-88f0-4613-9cad-16f2f3717cae" (UID: "9cb24d5e-88f0-4613-9cad-16f2f3717cae"). 
InnerVolumeSpecName "kube-api-access-lbpfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.127732 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6df98a0-3831-48eb-8a28-648be6ec3b08-kube-api-access-qsxp5" (OuterVolumeSpecName: "kube-api-access-qsxp5") pod "f6df98a0-3831-48eb-8a28-648be6ec3b08" (UID: "f6df98a0-3831-48eb-8a28-648be6ec3b08"). InnerVolumeSpecName "kube-api-access-qsxp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.128940 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-kube-api-access-qvdzk" (OuterVolumeSpecName: "kube-api-access-qvdzk") pod "7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1" (UID: "7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1"). InnerVolumeSpecName "kube-api-access-qvdzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.152818 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ec63236-123e-444d-b79c-4b0bb2dba193" path="/var/lib/kubelet/pods/1ec63236-123e-444d-b79c-4b0bb2dba193/volumes" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.220258 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cx5m\" (UniqueName: \"kubernetes.io/projected/c5c13b61-d034-4dfe-9632-f11b4210eba1-kube-api-access-4cx5m\") pod \"c5c13b61-d034-4dfe-9632-f11b4210eba1\" (UID: \"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.220480 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c13b61-d034-4dfe-9632-f11b4210eba1-operator-scripts\") pod \"c5c13b61-d034-4dfe-9632-f11b4210eba1\" (UID: 
\"c5c13b61-d034-4dfe-9632-f11b4210eba1\") " Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.221041 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f6df98a0-3831-48eb-8a28-648be6ec3b08-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.221062 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbpfz\" (UniqueName: \"kubernetes.io/projected/9cb24d5e-88f0-4613-9cad-16f2f3717cae-kube-api-access-lbpfz\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.221077 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cb24d5e-88f0-4613-9cad-16f2f3717cae-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.221087 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.221097 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvdzk\" (UniqueName: \"kubernetes.io/projected/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1-kube-api-access-qvdzk\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.221106 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsxp5\" (UniqueName: \"kubernetes.io/projected/f6df98a0-3831-48eb-8a28-648be6ec3b08-kube-api-access-qsxp5\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.221086 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5c13b61-d034-4dfe-9632-f11b4210eba1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"c5c13b61-d034-4dfe-9632-f11b4210eba1" (UID: "c5c13b61-d034-4dfe-9632-f11b4210eba1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.225716 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c13b61-d034-4dfe-9632-f11b4210eba1-kube-api-access-4cx5m" (OuterVolumeSpecName: "kube-api-access-4cx5m") pod "c5c13b61-d034-4dfe-9632-f11b4210eba1" (UID: "c5c13b61-d034-4dfe-9632-f11b4210eba1"). InnerVolumeSpecName "kube-api-access-4cx5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.323441 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c5c13b61-d034-4dfe-9632-f11b4210eba1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:43 crc kubenswrapper[4935]: I1217 09:23:43.323491 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cx5m\" (UniqueName: \"kubernetes.io/projected/c5c13b61-d034-4dfe-9632-f11b4210eba1-kube-api-access-4cx5m\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:44 crc kubenswrapper[4935]: I1217 09:23:44.024339 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-4rx9f" Dec 17 09:23:44 crc kubenswrapper[4935]: I1217 09:23:44.024416 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-a108-account-create-update-5gp8l" Dec 17 09:23:44 crc kubenswrapper[4935]: I1217 09:23:44.024339 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-fcac-account-create-update-xwfk2" Dec 17 09:23:44 crc kubenswrapper[4935]: I1217 09:23:44.024416 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6ff9-account-create-update-cqjmd" Dec 17 09:23:48 crc kubenswrapper[4935]: I1217 09:23:48.071452 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4kgb" event={"ID":"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41","Type":"ContainerStarted","Data":"736c36d5b0a3d1dfb37532adde6f6283ab1b4efaee9daf103245b47e311c24b6"} Dec 17 09:23:48 crc kubenswrapper[4935]: I1217 09:23:48.080294 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"d1233463f9e09f8ed68f0fc7e5d3cfcb8e0cdbb1ef68869e76bd42fa6699d5f2"} Dec 17 09:23:48 crc kubenswrapper[4935]: I1217 09:23:48.080343 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"ef70df00388aae220d7b382743b9b9853745595c9d906eb4128538706134ae22"} Dec 17 09:23:48 crc kubenswrapper[4935]: I1217 09:23:48.080353 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"7de97a21137f614c8feef2bdb8c6ec144528036dfed7b7208435cb7a9a34f593"} Dec 17 09:23:48 crc kubenswrapper[4935]: I1217 09:23:48.102758 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-t4kgb" podStartSLOduration=2.6653476449999998 podStartE2EDuration="10.102740448s" podCreationTimestamp="2025-12-17 09:23:38 +0000 UTC" firstStartedPulling="2025-12-17 09:23:39.787109607 +0000 UTC m=+1139.446950370" lastFinishedPulling="2025-12-17 09:23:47.22450241 +0000 UTC m=+1146.884343173" observedRunningTime="2025-12-17 09:23:48.097074179 +0000 UTC m=+1147.756914942" watchObservedRunningTime="2025-12-17 09:23:48.102740448 +0000 UTC m=+1147.762581211" Dec 17 09:23:49 crc kubenswrapper[4935]: I1217 09:23:49.098440 4935 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"8918497c526df4f6665ce01974aa10fa13cdd7f564c177a7f5d91b7862f51c32"} Dec 17 09:23:52 crc kubenswrapper[4935]: I1217 09:23:52.149439 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"e06e6227fd3b076244169b338f29ad16568014046964eb9dbccc3ecfb0ce0799"} Dec 17 09:23:53 crc kubenswrapper[4935]: I1217 09:23:53.161390 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"c7faec761c8090123f94952d94511e75df01f0c89599e1ef8a295363c55e7a2f"} Dec 17 09:23:53 crc kubenswrapper[4935]: I1217 09:23:53.161818 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"f6bc50503a9cb8edc79a0bdb7f14f31b39784c6e4982183286580b2daa155d96"} Dec 17 09:23:53 crc kubenswrapper[4935]: I1217 09:23:53.161829 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"786c8f60f34c73fe65a04f4809a827e6c1864640b02ece0c1b344b43d43e87db"} Dec 17 09:23:53 crc kubenswrapper[4935]: I1217 09:23:53.164541 4935 generic.go:334] "Generic (PLEG): container finished" podID="c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" containerID="fef2d06d27c63d666220b1ee7bf5e444341ac9e38fdb62d31d0790e181779c06" exitCode=0 Dec 17 09:23:53 crc kubenswrapper[4935]: I1217 09:23:53.164603 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bmvch" 
event={"ID":"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07","Type":"ContainerDied","Data":"fef2d06d27c63d666220b1ee7bf5e444341ac9e38fdb62d31d0790e181779c06"} Dec 17 09:23:53 crc kubenswrapper[4935]: I1217 09:23:53.167233 4935 generic.go:334] "Generic (PLEG): container finished" podID="d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" containerID="736c36d5b0a3d1dfb37532adde6f6283ab1b4efaee9daf103245b47e311c24b6" exitCode=0 Dec 17 09:23:53 crc kubenswrapper[4935]: I1217 09:23:53.167261 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4kgb" event={"ID":"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41","Type":"ContainerDied","Data":"736c36d5b0a3d1dfb37532adde6f6283ab1b4efaee9daf103245b47e311c24b6"} Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.209927 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"6972c11581280303f900e19d3c5ac208931433911bb1117c48af4fea6e044ee0"} Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.210368 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"4a76f111cfe12c714195ca7f6f1d2088c8a55acd8f303e86f711d992e51066f6"} Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.210387 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"a2c97a57-ac68-4fab-acbd-ecdec8db5fb5","Type":"ContainerStarted","Data":"8a7250b8ac2c02e3b645241e0d7bb6b867cc71f295f91567091aca629e26942b"} Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.718987 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=28.966514571 podStartE2EDuration="42.718958737s" podCreationTimestamp="2025-12-17 09:23:12 +0000 UTC" firstStartedPulling="2025-12-17 09:23:37.962435846 +0000 UTC m=+1137.622276599" 
lastFinishedPulling="2025-12-17 09:23:51.714880002 +0000 UTC m=+1151.374720765" observedRunningTime="2025-12-17 09:23:54.315874743 +0000 UTC m=+1153.975715506" watchObservedRunningTime="2025-12-17 09:23:54.718958737 +0000 UTC m=+1154.378799500" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724171 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-4297h"] Dec 17 09:23:54 crc kubenswrapper[4935]: E1217 09:23:54.724627 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ec63236-123e-444d-b79c-4b0bb2dba193" containerName="ovn-config" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724654 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ec63236-123e-444d-b79c-4b0bb2dba193" containerName="ovn-config" Dec 17 09:23:54 crc kubenswrapper[4935]: E1217 09:23:54.724668 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724676 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: E1217 09:23:54.724693 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6df98a0-3831-48eb-8a28-648be6ec3b08" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724700 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6df98a0-3831-48eb-8a28-648be6ec3b08" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: E1217 09:23:54.724708 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cb24d5e-88f0-4613-9cad-16f2f3717cae" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724714 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9cb24d5e-88f0-4613-9cad-16f2f3717cae" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: E1217 09:23:54.724726 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="454cfd39-9bc9-4e7e-968b-7f4b67654fb0" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724731 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="454cfd39-9bc9-4e7e-968b-7f4b67654fb0" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: E1217 09:23:54.724745 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd84fa06-5321-450f-b602-7f09d571a6d6" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724751 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd84fa06-5321-450f-b602-7f09d571a6d6" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: E1217 09:23:54.724775 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5c13b61-d034-4dfe-9632-f11b4210eba1" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724783 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c13b61-d034-4dfe-9632-f11b4210eba1" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724950 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="454cfd39-9bc9-4e7e-968b-7f4b67654fb0" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724965 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cb24d5e-88f0-4613-9cad-16f2f3717cae" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724976 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ec63236-123e-444d-b79c-4b0bb2dba193" containerName="ovn-config" Dec 17 09:23:54 crc kubenswrapper[4935]: 
I1217 09:23:54.724985 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.724998 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd84fa06-5321-450f-b602-7f09d571a6d6" containerName="mariadb-database-create" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.725007 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6df98a0-3831-48eb-8a28-648be6ec3b08" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.725015 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5c13b61-d034-4dfe-9632-f11b4210eba1" containerName="mariadb-account-create-update" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.726020 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.732457 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.739873 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-4297h"] Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.844399 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq972\" (UniqueName: \"kubernetes.io/projected/87f3771e-de24-4127-ad5b-9dbb32b32095-kube-api-access-pq972\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.844456 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-config\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.844504 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.844528 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.844583 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.844618 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.930041 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.946594 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq972\" (UniqueName: \"kubernetes.io/projected/87f3771e-de24-4127-ad5b-9dbb32b32095-kube-api-access-pq972\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.946770 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-config\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.946980 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.947026 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.947358 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " 
pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.947503 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.948161 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-config\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.948606 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.949060 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.949343 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 
09:23:54.949704 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:54 crc kubenswrapper[4935]: I1217 09:23:54.974642 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq972\" (UniqueName: \"kubernetes.io/projected/87f3771e-de24-4127-ad5b-9dbb32b32095-kube-api-access-pq972\") pod \"dnsmasq-dns-8467b54bcc-4297h\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.048490 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-combined-ca-bundle\") pod \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.049228 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rks6\" (UniqueName: \"kubernetes.io/projected/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-kube-api-access-6rks6\") pod \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.049511 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-config-data\") pod \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\" (UID: \"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41\") " Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.054828 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.058045 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-kube-api-access-6rks6" (OuterVolumeSpecName: "kube-api-access-6rks6") pod "d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" (UID: "d4bd9ed6-70bb-4e14-90f1-4cb7488daf41"). InnerVolumeSpecName "kube-api-access-6rks6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.083545 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" (UID: "d4bd9ed6-70bb-4e14-90f1-4cb7488daf41"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.106115 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-config-data" (OuterVolumeSpecName: "config-data") pod "d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" (UID: "d4bd9ed6-70bb-4e14-90f1-4cb7488daf41"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.125488 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.152865 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rks6\" (UniqueName: \"kubernetes.io/projected/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-kube-api-access-6rks6\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.152893 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.152905 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.224486 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-t4kgb" event={"ID":"d4bd9ed6-70bb-4e14-90f1-4cb7488daf41","Type":"ContainerDied","Data":"192e2adf03de52ef30178394e279f37ad75ce6f027a87ba84b9aaf181391deed"} Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.224535 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="192e2adf03de52ef30178394e279f37ad75ce6f027a87ba84b9aaf181391deed" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.224605 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-t4kgb" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.227082 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-bmvch" event={"ID":"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07","Type":"ContainerDied","Data":"96dfd812d9079f6e9ed5203ea424357401a538ad4ee7e87fd928c2b2053ae6d7"} Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.227128 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96dfd812d9079f6e9ed5203ea424357401a538ad4ee7e87fd928c2b2053ae6d7" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.227145 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-bmvch" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.253779 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4svc5\" (UniqueName: \"kubernetes.io/projected/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-kube-api-access-4svc5\") pod \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.253859 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-db-sync-config-data\") pod \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.253970 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-config-data\") pod \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.254082 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-combined-ca-bundle\") pod \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\" (UID: \"c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07\") " Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.260931 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" (UID: "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.261480 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-kube-api-access-4svc5" (OuterVolumeSpecName: "kube-api-access-4svc5") pod "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" (UID: "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07"). InnerVolumeSpecName "kube-api-access-4svc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.314007 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" (UID: "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.364111 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-config-data" (OuterVolumeSpecName: "config-data") pod "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" (UID: "c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.377060 4935 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.380091 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.380130 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.380145 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4svc5\" (UniqueName: \"kubernetes.io/projected/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07-kube-api-access-4svc5\") on node \"crc\" DevicePath \"\"" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.584171 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gsvpp"] Dec 17 09:23:55 crc kubenswrapper[4935]: E1217 09:23:55.584876 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" containerName="glance-db-sync" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.584895 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" containerName="glance-db-sync" Dec 17 09:23:55 crc kubenswrapper[4935]: E1217 09:23:55.584919 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" containerName="keystone-db-sync" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.584927 4935 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" containerName="keystone-db-sync" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.585105 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" containerName="keystone-db-sync" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.585137 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" containerName="glance-db-sync" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.586521 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.590458 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.600737 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.601042 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.607986 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.611855 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bnktl" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.656195 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gsvpp"] Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.692901 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pmt\" (UniqueName: \"kubernetes.io/projected/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-kube-api-access-m5pmt\") pod \"keystone-bootstrap-gsvpp\" (UID: 
\"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.692960 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-credential-keys\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.700339 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-scripts\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.700417 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.700460 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-fernet-keys\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.700501 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-combined-ca-bundle\") pod \"keystone-bootstrap-gsvpp\" (UID: 
\"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.719914 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-4297h"] Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.800663 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-4297h"] Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.803059 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-scripts\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.803107 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.803135 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-fernet-keys\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.803156 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-combined-ca-bundle\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.803253 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5pmt\" (UniqueName: \"kubernetes.io/projected/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-kube-api-access-m5pmt\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.803292 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-credential-keys\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.817830 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-scripts\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.818161 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-credential-keys\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.818884 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-fernet-keys\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.821192 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.821927 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-combined-ca-bundle\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.905989 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-n57wl"] Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.907626 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.934224 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pmt\" (UniqueName: \"kubernetes.io/projected/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-kube-api-access-m5pmt\") pod \"keystone-bootstrap-gsvpp\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:55 crc kubenswrapper[4935]: I1217 09:23:55.990844 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-n57wl"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.009010 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-svc\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.009093 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-config\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.009121 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4br2\" (UniqueName: \"kubernetes.io/projected/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-kube-api-access-j4br2\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.009177 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.009221 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.009307 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: 
I1217 09:23:56.092092 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-64f667bdd5-px4gj"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.093743 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.113577 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.113666 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.113761 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.115030 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.117977 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.118918 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.119040 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-svc\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.119135 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-config\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.119179 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4br2\" (UniqueName: \"kubernetes.io/projected/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-kube-api-access-j4br2\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.123638 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-64f667bdd5-px4gj"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 
09:23:56.127059 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.129804 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.130352 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-7wkdf" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.131413 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-svc\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.132709 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.134534 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-config\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.203511 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-g8ctb"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.205260 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.221093 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.225511 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-config-data\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.225585 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-logs\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.225674 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-horizon-secret-key\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.225734 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpptj\" (UniqueName: \"kubernetes.io/projected/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-kube-api-access-cpptj\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.225776 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-scripts\") pod \"horizon-64f667bdd5-px4gj\" (UID: 
\"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.228107 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4br2\" (UniqueName: \"kubernetes.io/projected/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-kube-api-access-j4br2\") pod \"dnsmasq-dns-58647bbf65-n57wl\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.258575 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" event={"ID":"87f3771e-de24-4127-ad5b-9dbb32b32095","Type":"ContainerStarted","Data":"9e569cddeb697d54d03f99922ad713c4008b966aedc91cd7f11eea7ed4e391d3"} Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.283247 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-45j9c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.283537 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.284038 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.317578 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-g8ctb"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.331849 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-config-data\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.331952 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-logs\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.332035 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-horizon-secret-key\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.332089 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-config\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.332126 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc4qj\" (UniqueName: \"kubernetes.io/projected/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-kube-api-access-vc4qj\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.332169 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpptj\" (UniqueName: \"kubernetes.io/projected/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-kube-api-access-cpptj\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.332221 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-scripts\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.332311 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-combined-ca-bundle\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.334158 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-logs\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.334167 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-config-data\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.335103 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-scripts\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.355925 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-horizon-secret-key\") pod \"horizon-64f667bdd5-px4gj\" (UID: 
\"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.356245 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.430927 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpptj\" (UniqueName: \"kubernetes.io/projected/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-kube-api-access-cpptj\") pod \"horizon-64f667bdd5-px4gj\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.432866 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-f7nmq"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.433517 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-combined-ca-bundle\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.433631 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-config\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.433655 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc4qj\" (UniqueName: \"kubernetes.io/projected/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-kube-api-access-vc4qj\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.434065 
4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.442903 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-config\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.448754 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-combined-ca-bundle\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.455857 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m6hmm" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.455968 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.455868 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.485159 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc4qj\" (UniqueName: \"kubernetes.io/projected/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-kube-api-access-vc4qj\") pod \"neutron-db-sync-g8ctb\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.486342 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.497659 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.512343 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f7nmq"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.512787 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.513161 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537426 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9sct\" (UniqueName: \"kubernetes.io/projected/9b17c8be-6039-4aa6-8227-cd2dfc076f77-kube-api-access-x9sct\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537487 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-scripts\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537522 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537541 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-scripts\") pod \"cinder-db-sync-f7nmq\" (UID: 
\"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537562 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-config-data\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537578 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537616 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-log-httpd\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537635 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plc9n\" (UniqueName: \"kubernetes.io/projected/ee747235-93a8-420f-95b6-232cdf8a3223-kube-api-access-plc9n\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537657 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-db-sync-config-data\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 
crc kubenswrapper[4935]: I1217 09:23:56.537689 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-combined-ca-bundle\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537710 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-run-httpd\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537738 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b17c8be-6039-4aa6-8227-cd2dfc076f77-etc-machine-id\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.537762 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-config-data\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.540367 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.575385 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639425 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-log-httpd\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639498 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plc9n\" (UniqueName: \"kubernetes.io/projected/ee747235-93a8-420f-95b6-232cdf8a3223-kube-api-access-plc9n\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639539 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-db-sync-config-data\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639580 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-combined-ca-bundle\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639608 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-run-httpd\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639638 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b17c8be-6039-4aa6-8227-cd2dfc076f77-etc-machine-id\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639660 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-config-data\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639697 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9sct\" (UniqueName: \"kubernetes.io/projected/9b17c8be-6039-4aa6-8227-cd2dfc076f77-kube-api-access-x9sct\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639722 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-scripts\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639749 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639768 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-scripts\") pod 
\"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639790 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-config-data\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.639808 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.640492 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b17c8be-6039-4aa6-8227-cd2dfc076f77-etc-machine-id\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.640917 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-log-httpd\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.651781 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.652418 4935 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-run-httpd\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.655197 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-config-data\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.655487 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-config-data\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.656969 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-combined-ca-bundle\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.661072 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-scripts\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.661673 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-db-sync-config-data\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 
17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.662386 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.663832 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-scripts\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.701589 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-687fb4989-g2n7c"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.703389 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.711986 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9sct\" (UniqueName: \"kubernetes.io/projected/9b17c8be-6039-4aa6-8227-cd2dfc076f77-kube-api-access-x9sct\") pod \"cinder-db-sync-f7nmq\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.720266 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.730393 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plc9n\" (UniqueName: \"kubernetes.io/projected/ee747235-93a8-420f-95b6-232cdf8a3223-kube-api-access-plc9n\") pod \"ceilometer-0\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.742552 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-horizon-secret-key\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.742626 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-config-data\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.742653 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-scripts\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.742678 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-logs\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 
09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.742726 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhb7x\" (UniqueName: \"kubernetes.io/projected/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-kube-api-access-hhb7x\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.758348 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-n57wl"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.765848 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.781681 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-rvtvc"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.806566 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.824160 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.824522 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tmblm" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.828358 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-nl2wv"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.839086 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.906899 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-combined-ca-bundle\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.906977 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhb7x\" (UniqueName: \"kubernetes.io/projected/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-kube-api-access-hhb7x\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.907196 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-horizon-secret-key\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.907243 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8dg8\" (UniqueName: \"kubernetes.io/projected/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-kube-api-access-w8dg8\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.907357 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-scripts\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.908747 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-config-data\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.909059 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.910521 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.910978 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-config-data\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.911112 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-scripts\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.911186 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-logs\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.911312 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-logs\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.911415 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-config-data\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.912250 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-scripts\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.912634 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-logs\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.915978 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-horizon-secret-key\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.945417 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.947263 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.951776 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.951999 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.952137 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2lgf4" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.956714 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-687fb4989-g2n7c"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.967393 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-vhjl6"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.967503 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhb7x\" (UniqueName: \"kubernetes.io/projected/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-kube-api-access-hhb7x\") pod \"horizon-687fb4989-g2n7c\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.969041 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.981702 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-nl2wv"] Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.990863 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.991012 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zw5sc" Dec 17 09:23:56 crc kubenswrapper[4935]: I1217 09:23:56.991731 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.013060 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vhjl6"] Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.017703 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-scripts\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.019763 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-db-sync-config-data\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.019957 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-logs\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " 
pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.020054 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-config-data\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.020130 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-combined-ca-bundle\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.020253 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4gz8\" (UniqueName: \"kubernetes.io/projected/a62d0f30-735b-410e-ac80-50a98636ff47-kube-api-access-z4gz8\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.020323 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-combined-ca-bundle\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.020455 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8dg8\" (UniqueName: \"kubernetes.io/projected/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-kube-api-access-w8dg8\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" 
Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.020614 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-logs\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.024067 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rvtvc"] Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.033474 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-scripts\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.041646 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.042900 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-combined-ca-bundle\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.043095 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-config-data\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.065808 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8dg8\" (UniqueName: 
\"kubernetes.io/projected/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-kube-api-access-w8dg8\") pod \"placement-db-sync-rvtvc\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125512 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125549 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125570 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125590 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125612 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125627 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qnwd\" (UniqueName: \"kubernetes.io/projected/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-kube-api-access-2qnwd\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125661 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-logs\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125687 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125709 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4gz8\" (UniqueName: \"kubernetes.io/projected/a62d0f30-735b-410e-ac80-50a98636ff47-kube-api-access-z4gz8\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125820 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-combined-ca-bundle\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125879 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgqmg\" (UniqueName: \"kubernetes.io/projected/fe048f60-c05c-4561-88bc-42d1b9eecd6c-kube-api-access-sgqmg\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.125902 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-config\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.128201 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.128263 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.128320 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.128445 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-db-sync-config-data\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.130156 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-combined-ca-bundle\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.132706 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-db-sync-config-data\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.156871 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4gz8\" (UniqueName: \"kubernetes.io/projected/a62d0f30-735b-410e-ac80-50a98636ff47-kube-api-access-z4gz8\") pod \"barbican-db-sync-vhjl6\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.168102 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.229961 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230046 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230068 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230085 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230104 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 
09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230126 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230141 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qnwd\" (UniqueName: \"kubernetes.io/projected/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-kube-api-access-2qnwd\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230453 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-logs\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230495 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230529 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgqmg\" (UniqueName: \"kubernetes.io/projected/fe048f60-c05c-4561-88bc-42d1b9eecd6c-kube-api-access-sgqmg\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230546 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-config\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230598 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.230622 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.232663 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.233982 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.235647 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.236110 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-logs\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.237033 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.237139 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-config\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.237782 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.238009 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.238135 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.238306 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.250569 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.256110 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgqmg\" (UniqueName: \"kubernetes.io/projected/fe048f60-c05c-4561-88bc-42d1b9eecd6c-kube-api-access-sgqmg\") pod \"dnsmasq-dns-5dc4fcdbc-nl2wv\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.278092 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " 
pod="openstack/glance-default-external-api-0" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.316747 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rvtvc" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.416999 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:23:57 crc kubenswrapper[4935]: I1217 09:23:57.974745 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qnwd\" (UniqueName: \"kubernetes.io/projected/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-kube-api-access-2qnwd\") pod \"glance-default-external-api-0\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " pod="openstack/glance-default-external-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.037144 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.124600 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.126689 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.129289 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.171450 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.248239 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.248440 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-logs\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.248666 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.248779 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" 
Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.248920 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.249126 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78w5w\" (UniqueName: \"kubernetes.io/projected/3ff58534-93f3-4ead-b612-4d851e84d804-kube-api-access-78w5w\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.249199 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.254163 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.351669 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.351754 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.351782 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.351827 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78w5w\" (UniqueName: \"kubernetes.io/projected/3ff58534-93f3-4ead-b612-4d851e84d804-kube-api-access-78w5w\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.351858 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.351905 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.351932 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.352937 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.360919 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-logs\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.360942 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.363895 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.380532 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.393751 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.394496 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.405530 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78w5w\" (UniqueName: \"kubernetes.io/projected/3ff58534-93f3-4ead-b612-4d851e84d804-kube-api-access-78w5w\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.406401 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64f667bdd5-px4gj"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.410548 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.419221 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.420000 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.428996 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f84dfc49-qqm76"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.439634 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.443363 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f84dfc49-qqm76"] Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.559497 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-horizon-secret-key\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.559645 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-config-data\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.559673 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vz4v\" (UniqueName: \"kubernetes.io/projected/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-kube-api-access-2vz4v\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.559706 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-scripts\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.559729 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-logs\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.664528 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-config-data\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.664596 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vz4v\" (UniqueName: \"kubernetes.io/projected/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-kube-api-access-2vz4v\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.664634 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-scripts\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.664661 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-logs\") pod \"horizon-7f84dfc49-qqm76\" (UID: 
\"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.664773 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-horizon-secret-key\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.669654 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-logs\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.670116 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-scripts\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.670622 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-config-data\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.673873 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-horizon-secret-key\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.692086 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vz4v\" (UniqueName: \"kubernetes.io/projected/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-kube-api-access-2vz4v\") pod \"horizon-7f84dfc49-qqm76\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:23:58 crc kubenswrapper[4935]: I1217 09:23:58.804456 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:24:00 crc kubenswrapper[4935]: I1217 09:24:00.131140 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:24:00 crc kubenswrapper[4935]: I1217 09:24:00.131795 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:24:01 crc kubenswrapper[4935]: I1217 09:24:01.492242 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" event={"ID":"87f3771e-de24-4127-ad5b-9dbb32b32095","Type":"ContainerStarted","Data":"adec9dbf411e9d2c74c968890c09feb1ea6e50019622a1586ee6102821b08159"} Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.122110 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.177298 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.185127 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-g8ctb"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.216374 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gsvpp"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.423261 4935 generic.go:334] "Generic (PLEG): container finished" podID="87f3771e-de24-4127-ad5b-9dbb32b32095" containerID="adec9dbf411e9d2c74c968890c09feb1ea6e50019622a1586ee6102821b08159" exitCode=0 Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.423801 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" event={"ID":"87f3771e-de24-4127-ad5b-9dbb32b32095","Type":"ContainerDied","Data":"adec9dbf411e9d2c74c968890c09feb1ea6e50019622a1586ee6102821b08159"} Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.488236 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8ctb" event={"ID":"99e284fa-5f85-409e-bcb3-fcb2b320a0fe","Type":"ContainerStarted","Data":"114a46d70b4ff70246b57dae562b7a1439293e7cfa68363ceb2ad566e7207730"} Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.489767 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerStarted","Data":"c6c367380d8db5fd171c79a8671c037dadfcf9a955d7835d97f78c4c8d26a4a1"} Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.497030 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gsvpp" event={"ID":"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f","Type":"ContainerStarted","Data":"b5b40c372729d3da302a15eba2992ceb7f4c853f694761bcf331ce38ddef9e2b"} Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.498162 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2c184ea2-862c-40e2-b4a9-9ea6d75b8407","Type":"ContainerStarted","Data":"8c77e949281ff9338d654fe5780f52ec517e82e70c185841087c1b6b466d83f6"} Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.703573 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-vhjl6"] Dec 17 09:24:02 crc kubenswrapper[4935]: W1217 09:24:02.719486 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62d0f30_735b_410e_ac80_50a98636ff47.slice/crio-146a5d0345e2e6feddae3f48799ff20c4811440e1a611f4126bce480d2675319 WatchSource:0}: Error finding container 146a5d0345e2e6feddae3f48799ff20c4811440e1a611f4126bce480d2675319: Status 404 returned error can't find the container with id 146a5d0345e2e6feddae3f48799ff20c4811440e1a611f4126bce480d2675319 Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.722474 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-n57wl"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.754449 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-687fb4989-g2n7c"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.768364 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-nl2wv"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.786643 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f84dfc49-qqm76"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.794387 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-f7nmq"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.801690 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-rvtvc"] Dec 17 09:24:02 crc kubenswrapper[4935]: I1217 09:24:02.814349 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64f667bdd5-px4gj"] Dec 17 09:24:02 crc 
kubenswrapper[4935]: I1217 09:24:02.906256 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:03 crc kubenswrapper[4935]: W1217 09:24:03.169711 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0e28daf_a4e4_4dce_b43a_27a779a7e0e3.slice/crio-85ead9b50b97c80f2f2b340cc9969a06d5e0500f541e8d7f950971d74ea7d46d WatchSource:0}: Error finding container 85ead9b50b97c80f2f2b340cc9969a06d5e0500f541e8d7f950971d74ea7d46d: Status 404 returned error can't find the container with id 85ead9b50b97c80f2f2b340cc9969a06d5e0500f541e8d7f950971d74ea7d46d Dec 17 09:24:03 crc kubenswrapper[4935]: W1217 09:24:03.193142 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b865a47_3ef5_4de0_88f8_4eeaba5de2cc.slice/crio-10a0127f3352bde2295d276bf724d2c1d2fe80a9913c80791827a8053637abc7 WatchSource:0}: Error finding container 10a0127f3352bde2295d276bf724d2c1d2fe80a9913c80791827a8053637abc7: Status 404 returned error can't find the container with id 10a0127f3352bde2295d276bf724d2c1d2fe80a9913c80791827a8053637abc7 Dec 17 09:24:03 crc kubenswrapper[4935]: W1217 09:24:03.244017 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b17c8be_6039_4aa6_8227_cd2dfc076f77.slice/crio-9b25756dd849c95ef48605f4c64fc6e91c80c6ece3efdf637ce80865932b1bb6 WatchSource:0}: Error finding container 9b25756dd849c95ef48605f4c64fc6e91c80c6ece3efdf637ce80865932b1bb6: Status 404 returned error can't find the container with id 9b25756dd849c95ef48605f4c64fc6e91c80c6ece3efdf637ce80865932b1bb6 Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.273260 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.475337 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-nb\") pod \"87f3771e-de24-4127-ad5b-9dbb32b32095\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.475596 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-swift-storage-0\") pod \"87f3771e-de24-4127-ad5b-9dbb32b32095\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.475647 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-svc\") pod \"87f3771e-de24-4127-ad5b-9dbb32b32095\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.475702 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pq972\" (UniqueName: \"kubernetes.io/projected/87f3771e-de24-4127-ad5b-9dbb32b32095-kube-api-access-pq972\") pod \"87f3771e-de24-4127-ad5b-9dbb32b32095\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.475779 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-config\") pod \"87f3771e-de24-4127-ad5b-9dbb32b32095\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.475844 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-sb\") pod \"87f3771e-de24-4127-ad5b-9dbb32b32095\" (UID: \"87f3771e-de24-4127-ad5b-9dbb32b32095\") " Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.494656 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f3771e-de24-4127-ad5b-9dbb32b32095-kube-api-access-pq972" (OuterVolumeSpecName: "kube-api-access-pq972") pod "87f3771e-de24-4127-ad5b-9dbb32b32095" (UID: "87f3771e-de24-4127-ad5b-9dbb32b32095"). InnerVolumeSpecName "kube-api-access-pq972". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.530665 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gsvpp" event={"ID":"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f","Type":"ContainerStarted","Data":"bff53f3f3f28cf1756acd22727f439be8690f1448e706dcfcc2e884c82114623"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.541608 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-config" (OuterVolumeSpecName: "config") pod "87f3771e-de24-4127-ad5b-9dbb32b32095" (UID: "87f3771e-de24-4127-ad5b-9dbb32b32095"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.542717 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-n57wl" event={"ID":"3c56c6f5-2db1-48f5-9190-8a76c2836d9f","Type":"ContainerStarted","Data":"ef57f061a66e38a04dae08ec15b5758f87ceac00864f83cacda9616968661e4b"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.549999 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87f3771e-de24-4127-ad5b-9dbb32b32095" (UID: "87f3771e-de24-4127-ad5b-9dbb32b32095"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.551867 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87f3771e-de24-4127-ad5b-9dbb32b32095" (UID: "87f3771e-de24-4127-ad5b-9dbb32b32095"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.559328 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" event={"ID":"fe048f60-c05c-4561-88bc-42d1b9eecd6c","Type":"ContainerStarted","Data":"c092f40bfeea17a4782fdd221416617988c696650416e92c6f8d0324413851f8"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.563422 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gsvpp" podStartSLOduration=8.5633948 podStartE2EDuration="8.5633948s" podCreationTimestamp="2025-12-17 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:03.561090574 +0000 UTC m=+1163.220931347" watchObservedRunningTime="2025-12-17 09:24:03.5633948 +0000 UTC m=+1163.223235553" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.573855 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhjl6" event={"ID":"a62d0f30-735b-410e-ac80-50a98636ff47","Type":"ContainerStarted","Data":"146a5d0345e2e6feddae3f48799ff20c4811440e1a611f4126bce480d2675319"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.578534 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.578567 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pq972\" (UniqueName: \"kubernetes.io/projected/87f3771e-de24-4127-ad5b-9dbb32b32095-kube-api-access-pq972\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.578578 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-config\") on 
node \"crc\" DevicePath \"\"" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.578588 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.578905 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f667bdd5-px4gj" event={"ID":"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc","Type":"ContainerStarted","Data":"10a0127f3352bde2295d276bf724d2c1d2fe80a9913c80791827a8053637abc7"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.580347 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f84dfc49-qqm76" event={"ID":"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3","Type":"ContainerStarted","Data":"85ead9b50b97c80f2f2b340cc9969a06d5e0500f541e8d7f950971d74ea7d46d"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.581358 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7nmq" event={"ID":"9b17c8be-6039-4aa6-8227-cd2dfc076f77","Type":"ContainerStarted","Data":"9b25756dd849c95ef48605f4c64fc6e91c80c6ece3efdf637ce80865932b1bb6"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.584695 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rvtvc" event={"ID":"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0","Type":"ContainerStarted","Data":"52b728d267838ea5355f4043d93c8d4d7d483e4b5f5dd66b4b55d1dc1b0c1bde"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.587412 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" event={"ID":"87f3771e-de24-4127-ad5b-9dbb32b32095","Type":"ContainerDied","Data":"9e569cddeb697d54d03f99922ad713c4008b966aedc91cd7f11eea7ed4e391d3"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.587456 4935 scope.go:117] "RemoveContainer" 
containerID="adec9dbf411e9d2c74c968890c09feb1ea6e50019622a1586ee6102821b08159" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.587614 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-4297h" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.591995 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87f3771e-de24-4127-ad5b-9dbb32b32095" (UID: "87f3771e-de24-4127-ad5b-9dbb32b32095"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.597609 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87f3771e-de24-4127-ad5b-9dbb32b32095" (UID: "87f3771e-de24-4127-ad5b-9dbb32b32095"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.601743 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8ctb" event={"ID":"99e284fa-5f85-409e-bcb3-fcb2b320a0fe","Type":"ContainerStarted","Data":"5c0545040d4c0d7a0defca6036355ee3fc839a7a9ac8518f0c60bdb0297de245"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.615568 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ff58534-93f3-4ead-b612-4d851e84d804","Type":"ContainerStarted","Data":"857c8430bea78fdd09b224b104aa6140f4ed7296e9a0509ca67727c33fd0eced"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.630641 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687fb4989-g2n7c" event={"ID":"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7","Type":"ContainerStarted","Data":"a55f130981aac9a3c39485e3603f9df47ca81da4f60e2f47317b58b6103230b4"} Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.645470 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-g8ctb" podStartSLOduration=7.64408058 podStartE2EDuration="7.64408058s" podCreationTimestamp="2025-12-17 09:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:03.623645912 +0000 UTC m=+1163.283486675" watchObservedRunningTime="2025-12-17 09:24:03.64408058 +0000 UTC m=+1163.303921343" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.680207 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.680242 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/87f3771e-de24-4127-ad5b-9dbb32b32095-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:03 crc kubenswrapper[4935]: I1217 09:24:03.993487 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-4297h"] Dec 17 09:24:04 crc kubenswrapper[4935]: I1217 09:24:04.027743 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-4297h"] Dec 17 09:24:04 crc kubenswrapper[4935]: I1217 09:24:04.656141 4935 generic.go:334] "Generic (PLEG): container finished" podID="3c56c6f5-2db1-48f5-9190-8a76c2836d9f" containerID="fbfda067fe89cd68e6029cd01e67c3d8ffb56273d56b5fd1bc6f8d81ee608a87" exitCode=0 Dec 17 09:24:04 crc kubenswrapper[4935]: I1217 09:24:04.656365 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-n57wl" event={"ID":"3c56c6f5-2db1-48f5-9190-8a76c2836d9f","Type":"ContainerDied","Data":"fbfda067fe89cd68e6029cd01e67c3d8ffb56273d56b5fd1bc6f8d81ee608a87"} Dec 17 09:24:04 crc kubenswrapper[4935]: I1217 09:24:04.668853 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c184ea2-862c-40e2-b4a9-9ea6d75b8407","Type":"ContainerStarted","Data":"331f64e864b7101f412a16b576c7442295889194909a4751c136ea4ddca0d3c0"} Dec 17 09:24:04 crc kubenswrapper[4935]: I1217 09:24:04.700353 4935 generic.go:334] "Generic (PLEG): container finished" podID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerID="c7de911a37b6d5b1f5a2f4b12c118eaf3ac99c74e315beb0133a888b909c6748" exitCode=0 Dec 17 09:24:04 crc kubenswrapper[4935]: I1217 09:24:04.700482 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" event={"ID":"fe048f60-c05c-4561-88bc-42d1b9eecd6c","Type":"ContainerDied","Data":"c7de911a37b6d5b1f5a2f4b12c118eaf3ac99c74e315beb0133a888b909c6748"} Dec 17 09:24:04 crc kubenswrapper[4935]: I1217 09:24:04.745845 4935 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ff58534-93f3-4ead-b612-4d851e84d804","Type":"ContainerStarted","Data":"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473"} Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.153917 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f3771e-de24-4127-ad5b-9dbb32b32095" path="/var/lib/kubelet/pods/87f3771e-de24-4127-ad5b-9dbb32b32095/volumes" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.195257 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.342061 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4br2\" (UniqueName: \"kubernetes.io/projected/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-kube-api-access-j4br2\") pod \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.342144 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-sb\") pod \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.342187 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-config\") pod \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.342205 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-svc\") pod 
\"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.342396 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb\") pod \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.342427 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-swift-storage-0\") pod \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.373993 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-kube-api-access-j4br2" (OuterVolumeSpecName: "kube-api-access-j4br2") pod "3c56c6f5-2db1-48f5-9190-8a76c2836d9f" (UID: "3c56c6f5-2db1-48f5-9190-8a76c2836d9f"). InnerVolumeSpecName "kube-api-access-j4br2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.376211 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c56c6f5-2db1-48f5-9190-8a76c2836d9f" (UID: "3c56c6f5-2db1-48f5-9190-8a76c2836d9f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.391109 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c56c6f5-2db1-48f5-9190-8a76c2836d9f" (UID: "3c56c6f5-2db1-48f5-9190-8a76c2836d9f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.397580 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-config" (OuterVolumeSpecName: "config") pod "3c56c6f5-2db1-48f5-9190-8a76c2836d9f" (UID: "3c56c6f5-2db1-48f5-9190-8a76c2836d9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:05 crc kubenswrapper[4935]: E1217 09:24:05.406769 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb podName:3c56c6f5-2db1-48f5-9190-8a76c2836d9f nodeName:}" failed. No retries permitted until 2025-12-17 09:24:05.906715976 +0000 UTC m=+1165.566556749 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "ovsdbserver-nb" (UniqueName: "kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb") pod "3c56c6f5-2db1-48f5-9190-8a76c2836d9f" (UID: "3c56c6f5-2db1-48f5-9190-8a76c2836d9f") : error deleting /var/lib/kubelet/pods/3c56c6f5-2db1-48f5-9190-8a76c2836d9f/volume-subpaths: remove /var/lib/kubelet/pods/3c56c6f5-2db1-48f5-9190-8a76c2836d9f/volume-subpaths: no such file or directory Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.407420 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c56c6f5-2db1-48f5-9190-8a76c2836d9f" (UID: "3c56c6f5-2db1-48f5-9190-8a76c2836d9f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.445595 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.445643 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4br2\" (UniqueName: \"kubernetes.io/projected/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-kube-api-access-j4br2\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.445657 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.445669 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:05 crc 
kubenswrapper[4935]: I1217 09:24:05.445682 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.767010 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" event={"ID":"fe048f60-c05c-4561-88bc-42d1b9eecd6c","Type":"ContainerStarted","Data":"2dcc04ce65f67c91261b2221371e56d87a43818d65b35702eb57dd40dcae49f2"} Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.767526 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.770181 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-n57wl" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.770166 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-n57wl" event={"ID":"3c56c6f5-2db1-48f5-9190-8a76c2836d9f","Type":"ContainerDied","Data":"ef57f061a66e38a04dae08ec15b5758f87ceac00864f83cacda9616968661e4b"} Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.770438 4935 scope.go:117] "RemoveContainer" containerID="fbfda067fe89cd68e6029cd01e67c3d8ffb56273d56b5fd1bc6f8d81ee608a87" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.772804 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c184ea2-862c-40e2-b4a9-9ea6d75b8407","Type":"ContainerStarted","Data":"acb56f7b5ea41e4498316145a7fb828bf82b4ee9f53551d2e249c9f03e7c7fd0"} Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.773012 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerName="glance-log" 
containerID="cri-o://331f64e864b7101f412a16b576c7442295889194909a4751c136ea4ddca0d3c0" gracePeriod=30 Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.773036 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerName="glance-httpd" containerID="cri-o://acb56f7b5ea41e4498316145a7fb828bf82b4ee9f53551d2e249c9f03e7c7fd0" gracePeriod=30 Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.829479 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" podStartSLOduration=9.82944985 podStartE2EDuration="9.82944985s" podCreationTimestamp="2025-12-17 09:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:05.814784433 +0000 UTC m=+1165.474625196" watchObservedRunningTime="2025-12-17 09:24:05.82944985 +0000 UTC m=+1165.489290613" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.893801 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=9.893772751 podStartE2EDuration="9.893772751s" podCreationTimestamp="2025-12-17 09:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:05.890753838 +0000 UTC m=+1165.550594601" watchObservedRunningTime="2025-12-17 09:24:05.893772751 +0000 UTC m=+1165.553613514" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.956934 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb\") pod \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\" (UID: \"3c56c6f5-2db1-48f5-9190-8a76c2836d9f\") " Dec 17 09:24:05 crc kubenswrapper[4935]: 
I1217 09:24:05.957852 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c56c6f5-2db1-48f5-9190-8a76c2836d9f" (UID: "3c56c6f5-2db1-48f5-9190-8a76c2836d9f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:05 crc kubenswrapper[4935]: I1217 09:24:05.958053 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c56c6f5-2db1-48f5-9190-8a76c2836d9f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.242975 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-n57wl"] Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.261797 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-n57wl"] Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.791235 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ff58534-93f3-4ead-b612-4d851e84d804","Type":"ContainerStarted","Data":"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2"} Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.791475 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-log" containerID="cri-o://5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473" gracePeriod=30 Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.792198 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-httpd" 
containerID="cri-o://ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2" gracePeriod=30 Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.812209 4935 generic.go:334] "Generic (PLEG): container finished" podID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerID="acb56f7b5ea41e4498316145a7fb828bf82b4ee9f53551d2e249c9f03e7c7fd0" exitCode=0 Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.812237 4935 generic.go:334] "Generic (PLEG): container finished" podID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerID="331f64e864b7101f412a16b576c7442295889194909a4751c136ea4ddca0d3c0" exitCode=143 Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.813499 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c184ea2-862c-40e2-b4a9-9ea6d75b8407","Type":"ContainerDied","Data":"acb56f7b5ea41e4498316145a7fb828bf82b4ee9f53551d2e249c9f03e7c7fd0"} Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.813535 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c184ea2-862c-40e2-b4a9-9ea6d75b8407","Type":"ContainerDied","Data":"331f64e864b7101f412a16b576c7442295889194909a4751c136ea4ddca0d3c0"} Dec 17 09:24:06 crc kubenswrapper[4935]: I1217 09:24:06.820597 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.820574205 podStartE2EDuration="9.820574205s" podCreationTimestamp="2025-12-17 09:23:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:06.818100905 +0000 UTC m=+1166.477941668" watchObservedRunningTime="2025-12-17 09:24:06.820574205 +0000 UTC m=+1166.480414958" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.144488 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c56c6f5-2db1-48f5-9190-8a76c2836d9f" 
path="/var/lib/kubelet/pods/3c56c6f5-2db1-48f5-9190-8a76c2836d9f/volumes" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.260589 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.305322 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-httpd-run\") pod \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.305820 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qnwd\" (UniqueName: \"kubernetes.io/projected/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-kube-api-access-2qnwd\") pod \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.305923 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-combined-ca-bundle\") pod \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.305986 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-config-data\") pod \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.306025 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-logs\") pod \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\" (UID: 
\"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.306056 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.306086 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-scripts\") pod \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\" (UID: \"2c184ea2-862c-40e2-b4a9-9ea6d75b8407\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.309868 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-logs" (OuterVolumeSpecName: "logs") pod "2c184ea2-862c-40e2-b4a9-9ea6d75b8407" (UID: "2c184ea2-862c-40e2-b4a9-9ea6d75b8407"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.310045 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2c184ea2-862c-40e2-b4a9-9ea6d75b8407" (UID: "2c184ea2-862c-40e2-b4a9-9ea6d75b8407"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.323786 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "2c184ea2-862c-40e2-b4a9-9ea6d75b8407" (UID: "2c184ea2-862c-40e2-b4a9-9ea6d75b8407"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.324033 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-kube-api-access-2qnwd" (OuterVolumeSpecName: "kube-api-access-2qnwd") pod "2c184ea2-862c-40e2-b4a9-9ea6d75b8407" (UID: "2c184ea2-862c-40e2-b4a9-9ea6d75b8407"). InnerVolumeSpecName "kube-api-access-2qnwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.341257 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-scripts" (OuterVolumeSpecName: "scripts") pod "2c184ea2-862c-40e2-b4a9-9ea6d75b8407" (UID: "2c184ea2-862c-40e2-b4a9-9ea6d75b8407"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.357833 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c184ea2-862c-40e2-b4a9-9ea6d75b8407" (UID: "2c184ea2-862c-40e2-b4a9-9ea6d75b8407"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.418427 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qnwd\" (UniqueName: \"kubernetes.io/projected/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-kube-api-access-2qnwd\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.418831 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.419018 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.419244 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.419642 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.419762 4935 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.423806 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-config-data" (OuterVolumeSpecName: "config-data") pod "2c184ea2-862c-40e2-b4a9-9ea6d75b8407" (UID: "2c184ea2-862c-40e2-b4a9-9ea6d75b8407"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.466525 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.521452 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.521496 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c184ea2-862c-40e2-b4a9-9ea6d75b8407-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.606856 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.723960 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-config-data\") pod \"3ff58534-93f3-4ead-b612-4d851e84d804\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.723999 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-scripts\") pod \"3ff58534-93f3-4ead-b612-4d851e84d804\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.725467 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-logs\") pod \"3ff58534-93f3-4ead-b612-4d851e84d804\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " Dec 17 
09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.725942 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-logs" (OuterVolumeSpecName: "logs") pod "3ff58534-93f3-4ead-b612-4d851e84d804" (UID: "3ff58534-93f3-4ead-b612-4d851e84d804"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.726044 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78w5w\" (UniqueName: \"kubernetes.io/projected/3ff58534-93f3-4ead-b612-4d851e84d804-kube-api-access-78w5w\") pod \"3ff58534-93f3-4ead-b612-4d851e84d804\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.726108 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-httpd-run\") pod \"3ff58534-93f3-4ead-b612-4d851e84d804\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.726606 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3ff58534-93f3-4ead-b612-4d851e84d804" (UID: "3ff58534-93f3-4ead-b612-4d851e84d804"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.727378 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"3ff58534-93f3-4ead-b612-4d851e84d804\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.727490 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-combined-ca-bundle\") pod \"3ff58534-93f3-4ead-b612-4d851e84d804\" (UID: \"3ff58534-93f3-4ead-b612-4d851e84d804\") " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.728357 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.728393 4935 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3ff58534-93f3-4ead-b612-4d851e84d804-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.730521 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ff58534-93f3-4ead-b612-4d851e84d804-kube-api-access-78w5w" (OuterVolumeSpecName: "kube-api-access-78w5w") pod "3ff58534-93f3-4ead-b612-4d851e84d804" (UID: "3ff58534-93f3-4ead-b612-4d851e84d804"). InnerVolumeSpecName "kube-api-access-78w5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.735066 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-scripts" (OuterVolumeSpecName: "scripts") pod "3ff58534-93f3-4ead-b612-4d851e84d804" (UID: "3ff58534-93f3-4ead-b612-4d851e84d804"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.736645 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "3ff58534-93f3-4ead-b612-4d851e84d804" (UID: "3ff58534-93f3-4ead-b612-4d851e84d804"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.773461 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ff58534-93f3-4ead-b612-4d851e84d804" (UID: "3ff58534-93f3-4ead-b612-4d851e84d804"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.778221 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-config-data" (OuterVolumeSpecName: "config-data") pod "3ff58534-93f3-4ead-b612-4d851e84d804" (UID: "3ff58534-93f3-4ead-b612-4d851e84d804"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.830720 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.830758 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.830771 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78w5w\" (UniqueName: \"kubernetes.io/projected/3ff58534-93f3-4ead-b612-4d851e84d804-kube-api-access-78w5w\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.830808 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.830821 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ff58534-93f3-4ead-b612-4d851e84d804-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.852490 4935 generic.go:334] "Generic (PLEG): container finished" podID="3ff58534-93f3-4ead-b612-4d851e84d804" containerID="ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2" exitCode=0 Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.852540 4935 generic.go:334] "Generic (PLEG): container finished" podID="3ff58534-93f3-4ead-b612-4d851e84d804" containerID="5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473" exitCode=143 Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.852671 4935 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ff58534-93f3-4ead-b612-4d851e84d804","Type":"ContainerDied","Data":"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2"} Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.852706 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.852729 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ff58534-93f3-4ead-b612-4d851e84d804","Type":"ContainerDied","Data":"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473"} Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.852745 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3ff58534-93f3-4ead-b612-4d851e84d804","Type":"ContainerDied","Data":"857c8430bea78fdd09b224b104aa6140f4ed7296e9a0509ca67727c33fd0eced"} Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.852764 4935 scope.go:117] "RemoveContainer" containerID="ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.855503 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.859704 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2c184ea2-862c-40e2-b4a9-9ea6d75b8407","Type":"ContainerDied","Data":"8c77e949281ff9338d654fe5780f52ec517e82e70c185841087c1b6b466d83f6"} Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.859797 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.901822 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.914320 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.935511 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.939837 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.971990 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:07 crc kubenswrapper[4935]: I1217 09:24:07.978496 4935 scope.go:117] "RemoveContainer" containerID="5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.021346 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.021845 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-log" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.021869 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-log" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.021890 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f3771e-de24-4127-ad5b-9dbb32b32095" containerName="init" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.021897 4935 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="87f3771e-de24-4127-ad5b-9dbb32b32095" containerName="init" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.021909 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-httpd" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.021915 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-httpd" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.021927 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c56c6f5-2db1-48f5-9190-8a76c2836d9f" containerName="init" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.021933 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c56c6f5-2db1-48f5-9190-8a76c2836d9f" containerName="init" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.021951 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerName="glance-httpd" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.021956 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerName="glance-httpd" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.021976 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerName="glance-log" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.021983 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerName="glance-log" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.022156 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-httpd" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.022175 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" 
containerName="glance-httpd" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.022183 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f3771e-de24-4127-ad5b-9dbb32b32095" containerName="init" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.022188 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c56c6f5-2db1-48f5-9190-8a76c2836d9f" containerName="init" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.022203 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" containerName="glance-log" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.022216 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" containerName="glance-log" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.023265 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.032328 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.034101 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.035116 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.035211 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-2lgf4" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.049532 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.051476 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.054326 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.066121 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.081686 4935 scope.go:117] "RemoveContainer" containerID="ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.086749 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2\": container with ID starting with ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2 not found: ID does not exist" containerID="ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.086819 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2"} err="failed to get container status \"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2\": rpc error: code = NotFound desc = could not find container \"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2\": container with ID starting with ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2 not found: ID does not exist" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.086855 4935 scope.go:117] "RemoveContainer" containerID="5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.088030 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473\": container with ID starting with 5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473 not found: ID does not exist" containerID="5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.088058 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473"} err="failed to get container status \"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473\": rpc error: code = NotFound desc = could not find container \"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473\": container with ID starting with 5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473 not found: ID does not exist" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.088075 4935 scope.go:117] "RemoveContainer" containerID="ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.088363 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2"} err="failed to get container status \"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2\": rpc error: code = NotFound desc = could not find container \"ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2\": container with ID starting with ac8f712639c6420afe9f4a5532e0e70cf225e2cb340acf9d878ef9948c99eea2 not found: ID does not exist" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.088377 4935 scope.go:117] "RemoveContainer" containerID="5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.089599 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473"} err="failed to get container status \"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473\": rpc error: code = NotFound desc = could not find container \"5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473\": container with ID starting with 5f4faa48fa10f81c14d4248ff733d03e841fbe4f075077de2a70025760e51473 not found: ID does not exist" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.089655 4935 scope.go:117] "RemoveContainer" containerID="acb56f7b5ea41e4498316145a7fb828bf82b4ee9f53551d2e249c9f03e7c7fd0" Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.120017 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ff58534_93f3_4ead_b612_4d851e84d804.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ff58534_93f3_4ead_b612_4d851e84d804.slice/crio-857c8430bea78fdd09b224b104aa6140f4ed7296e9a0509ca67727c33fd0eced\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c184ea2_862c_40e2_b4a9_9ea6d75b8407.slice\": RecentStats: unable to find data in memory cache]" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.140977 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.141040 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.141085 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-logs\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.141120 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.141235 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.141298 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.141328 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjh7\" (UniqueName: 
\"kubernetes.io/projected/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-kube-api-access-kpjh7\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.203589 4935 scope.go:117] "RemoveContainer" containerID="331f64e864b7101f412a16b576c7442295889194909a4751c136ea4ddca0d3c0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.242690 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244564 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244598 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244652 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-logs\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244698 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244813 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244857 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxmtf\" (UniqueName: \"kubernetes.io/projected/8f5ef1dd-6982-47e6-91b2-ecac656078e6-kube-api-access-rxmtf\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244906 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244957 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.244987 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.245027 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.245046 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.245080 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjh7\" (UniqueName: \"kubernetes.io/projected/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-kube-api-access-kpjh7\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.245193 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.247114 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.247228 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.247547 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-logs\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.257169 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.263954 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.267904 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.271960 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjh7\" (UniqueName: \"kubernetes.io/projected/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-kube-api-access-kpjh7\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.291550 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.347451 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.347555 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.347579 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-logs\") pod 
\"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.347655 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-scripts\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.347680 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxmtf\" (UniqueName: \"kubernetes.io/projected/8f5ef1dd-6982-47e6-91b2-ecac656078e6-kube-api-access-rxmtf\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.347714 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.347734 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.352008 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") device 
mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.352750 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-logs\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.358846 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.369855 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.375021 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.379451 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-config-data\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.381160 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.381453 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxmtf\" (UniqueName: \"kubernetes.io/projected/8f5ef1dd-6982-47e6-91b2-ecac656078e6-kube-api-access-rxmtf\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.430230 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:08 crc kubenswrapper[4935]: E1217 09:24:08.434204 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-default-external-api-0" podUID="8f5ef1dd-6982-47e6-91b2-ecac656078e6" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.465571 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.581834 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.670946 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-687fb4989-g2n7c"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.710153 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54c44548bb-26j2l"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.712624 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.724855 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54c44548bb-26j2l"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.733858 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.800551 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f84dfc49-qqm76"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.865888 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bbfb6547d-64jt7"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.866006 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-config-data\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.866080 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-secret-key\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.866122 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-tls-certs\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.866146 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-combined-ca-bundle\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.866222 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tf48\" (UniqueName: \"kubernetes.io/projected/5385d045-3f7c-447d-8ce8-d12a8de0cdce-kube-api-access-4tf48\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.866243 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-scripts\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.866305 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5385d045-3f7c-447d-8ce8-d12a8de0cdce-logs\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.868815 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.903125 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbfb6547d-64jt7"] Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.915301 4935 generic.go:334] "Generic (PLEG): container finished" podID="a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" containerID="bff53f3f3f28cf1756acd22727f439be8690f1448e706dcfcc2e884c82114623" exitCode=0 Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.915420 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gsvpp" event={"ID":"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f","Type":"ContainerDied","Data":"bff53f3f3f28cf1756acd22727f439be8690f1448e706dcfcc2e884c82114623"} Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.924867 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.971440 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.992192 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jd8k\" (UniqueName: \"kubernetes.io/projected/3658abd7-bc1e-4359-aa8b-011fe7189342-kube-api-access-8jd8k\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.995488 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tf48\" (UniqueName: \"kubernetes.io/projected/5385d045-3f7c-447d-8ce8-d12a8de0cdce-kube-api-access-4tf48\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.995610 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-scripts\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:08 crc kubenswrapper[4935]: I1217 09:24:08.995743 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3658abd7-bc1e-4359-aa8b-011fe7189342-scripts\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:08.996668 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-scripts\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 
09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.012554 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5385d045-3f7c-447d-8ce8-d12a8de0cdce-logs\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.012992 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3658abd7-bc1e-4359-aa8b-011fe7189342-logs\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013107 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-config-data\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013192 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-horizon-tls-certs\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013377 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-secret-key\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013546 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-combined-ca-bundle\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013632 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-tls-certs\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013747 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-combined-ca-bundle\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013828 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5385d045-3f7c-447d-8ce8-d12a8de0cdce-logs\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.013850 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3658abd7-bc1e-4359-aa8b-011fe7189342-config-data\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.014021 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-horizon-secret-key\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.016409 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-config-data\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.024651 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-combined-ca-bundle\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.025629 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-secret-key\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.036180 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tf48\" (UniqueName: \"kubernetes.io/projected/5385d045-3f7c-447d-8ce8-d12a8de0cdce-kube-api-access-4tf48\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.037107 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-tls-certs\") pod \"horizon-54c44548bb-26j2l\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") " pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.061667 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.117082 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-logs\") pod \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.117159 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-scripts\") pod \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.117249 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-config-data\") pod \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.117288 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.117324 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxmtf\" (UniqueName: \"kubernetes.io/projected/8f5ef1dd-6982-47e6-91b2-ecac656078e6-kube-api-access-rxmtf\") pod 
\"8f5ef1dd-6982-47e6-91b2-ecac656078e6\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.118088 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-httpd-run\") pod \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.118519 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-logs" (OuterVolumeSpecName: "logs") pod "8f5ef1dd-6982-47e6-91b2-ecac656078e6" (UID: "8f5ef1dd-6982-47e6-91b2-ecac656078e6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.118262 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-combined-ca-bundle\") pod \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\" (UID: \"8f5ef1dd-6982-47e6-91b2-ecac656078e6\") " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119133 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8f5ef1dd-6982-47e6-91b2-ecac656078e6" (UID: "8f5ef1dd-6982-47e6-91b2-ecac656078e6"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119212 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3658abd7-bc1e-4359-aa8b-011fe7189342-logs\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119293 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-horizon-tls-certs\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119547 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-combined-ca-bundle\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119630 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3658abd7-bc1e-4359-aa8b-011fe7189342-config-data\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119654 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-horizon-secret-key\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 
09:24:09.119773 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jd8k\" (UniqueName: \"kubernetes.io/projected/3658abd7-bc1e-4359-aa8b-011fe7189342-kube-api-access-8jd8k\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119825 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3658abd7-bc1e-4359-aa8b-011fe7189342-logs\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.119868 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3658abd7-bc1e-4359-aa8b-011fe7189342-scripts\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.120087 4935 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.120100 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8f5ef1dd-6982-47e6-91b2-ecac656078e6-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.121544 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3658abd7-bc1e-4359-aa8b-011fe7189342-config-data\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 
09:24:09.123133 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3658abd7-bc1e-4359-aa8b-011fe7189342-scripts\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.126844 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-horizon-secret-key\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.134193 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5ef1dd-6982-47e6-91b2-ecac656078e6-kube-api-access-rxmtf" (OuterVolumeSpecName: "kube-api-access-rxmtf") pod "8f5ef1dd-6982-47e6-91b2-ecac656078e6" (UID: "8f5ef1dd-6982-47e6-91b2-ecac656078e6"). InnerVolumeSpecName "kube-api-access-rxmtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.138439 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8f5ef1dd-6982-47e6-91b2-ecac656078e6" (UID: "8f5ef1dd-6982-47e6-91b2-ecac656078e6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.144052 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-combined-ca-bundle\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.147920 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-config-data" (OuterVolumeSpecName: "config-data") pod "8f5ef1dd-6982-47e6-91b2-ecac656078e6" (UID: "8f5ef1dd-6982-47e6-91b2-ecac656078e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.148513 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "8f5ef1dd-6982-47e6-91b2-ecac656078e6" (UID: "8f5ef1dd-6982-47e6-91b2-ecac656078e6"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.150948 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-scripts" (OuterVolumeSpecName: "scripts") pod "8f5ef1dd-6982-47e6-91b2-ecac656078e6" (UID: "8f5ef1dd-6982-47e6-91b2-ecac656078e6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.160401 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/3658abd7-bc1e-4359-aa8b-011fe7189342-horizon-tls-certs\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.163031 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c184ea2-862c-40e2-b4a9-9ea6d75b8407" path="/var/lib/kubelet/pods/2c184ea2-862c-40e2-b4a9-9ea6d75b8407/volumes" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.163903 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ff58534-93f3-4ead-b612-4d851e84d804" path="/var/lib/kubelet/pods/3ff58534-93f3-4ead-b612-4d851e84d804/volumes" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.181030 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jd8k\" (UniqueName: \"kubernetes.io/projected/3658abd7-bc1e-4359-aa8b-011fe7189342-kube-api-access-8jd8k\") pod \"horizon-7bbfb6547d-64jt7\" (UID: \"3658abd7-bc1e-4359-aa8b-011fe7189342\") " pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.224737 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.224774 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.224784 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8f5ef1dd-6982-47e6-91b2-ecac656078e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.224806 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.224820 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxmtf\" (UniqueName: \"kubernetes.io/projected/8f5ef1dd-6982-47e6-91b2-ecac656078e6-kube-api-access-rxmtf\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.228930 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.265170 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.327795 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:09 crc kubenswrapper[4935]: I1217 09:24:09.939482 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.020265 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.035822 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.058940 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.066689 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.072619 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.073137 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.088542 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146506 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146591 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146650 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146697 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-scripts\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146770 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146802 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146863 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xwjs\" (UniqueName: \"kubernetes.io/projected/abc82931-62eb-42ba-952d-9b1c99f2fd25-kube-api-access-9xwjs\") pod 
\"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.146909 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-logs\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.248958 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-scripts\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249040 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249081 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249118 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xwjs\" (UniqueName: \"kubernetes.io/projected/abc82931-62eb-42ba-952d-9b1c99f2fd25-kube-api-access-9xwjs\") pod \"glance-default-external-api-0\" (UID: 
\"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249164 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-logs\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249244 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249357 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-config-data\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249403 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.249877 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc 
kubenswrapper[4935]: I1217 09:24:10.250474 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.250512 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-logs\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.259181 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-config-data\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.259920 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.261782 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.266310 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-scripts\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.271193 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xwjs\" (UniqueName: \"kubernetes.io/projected/abc82931-62eb-42ba-952d-9b1c99f2fd25-kube-api-access-9xwjs\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.318701 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " pod="openstack/glance-default-external-api-0" Dec 17 09:24:10 crc kubenswrapper[4935]: I1217 09:24:10.390968 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:24:11 crc kubenswrapper[4935]: I1217 09:24:11.135464 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5ef1dd-6982-47e6-91b2-ecac656078e6" path="/var/lib/kubelet/pods/8f5ef1dd-6982-47e6-91b2-ecac656078e6/volumes" Dec 17 09:24:12 crc kubenswrapper[4935]: I1217 09:24:12.418571 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:24:12 crc kubenswrapper[4935]: I1217 09:24:12.492450 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-96f46"] Dec 17 09:24:12 crc kubenswrapper[4935]: I1217 09:24:12.492756 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" containerID="cri-o://ae969a21e12d842efa534d2ea406cd7a71b99213044780025d473d6e16ccf39f" gracePeriod=10 Dec 17 09:24:12 crc kubenswrapper[4935]: I1217 09:24:12.984872 4935 generic.go:334] "Generic (PLEG): container finished" podID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerID="ae969a21e12d842efa534d2ea406cd7a71b99213044780025d473d6e16ccf39f" exitCode=0 Dec 17 09:24:12 crc kubenswrapper[4935]: I1217 09:24:12.984915 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" event={"ID":"582e42e7-25fe-4a3e-9a76-87f9c62af78c","Type":"ContainerDied","Data":"ae969a21e12d842efa534d2ea406cd7a71b99213044780025d473d6e16ccf39f"} Dec 17 09:24:17 crc kubenswrapper[4935]: I1217 09:24:17.493507 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.618395 4935 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.709735 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-fernet-keys\") pod \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.709809 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-combined-ca-bundle\") pod \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.709898 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-scripts\") pod \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.710310 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5pmt\" (UniqueName: \"kubernetes.io/projected/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-kube-api-access-m5pmt\") pod \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.710374 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-credential-keys\") pod \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.710399 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data\") pod \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.732532 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" (UID: "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.733327 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" (UID: "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.735406 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-scripts" (OuterVolumeSpecName: "scripts") pod "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" (UID: "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.777292 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-kube-api-access-m5pmt" (OuterVolumeSpecName: "kube-api-access-m5pmt") pod "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" (UID: "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f"). InnerVolumeSpecName "kube-api-access-m5pmt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:19 crc kubenswrapper[4935]: E1217 09:24:19.777348 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data podName:a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f nodeName:}" failed. No retries permitted until 2025-12-17 09:24:20.277305159 +0000 UTC m=+1179.937145922 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data") pod "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" (UID: "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f") : error deleting /var/lib/kubelet/pods/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f/volume-subpaths: remove /var/lib/kubelet/pods/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f/volume-subpaths: no such file or directory Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.784535 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" (UID: "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.815925 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5pmt\" (UniqueName: \"kubernetes.io/projected/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-kube-api-access-m5pmt\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.816373 4935 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.816499 4935 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.816594 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:19 crc kubenswrapper[4935]: I1217 09:24:19.816683 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.060764 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gsvpp" event={"ID":"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f","Type":"ContainerDied","Data":"b5b40c372729d3da302a15eba2992ceb7f4c853f694761bcf331ce38ddef9e2b"} Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.060827 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5b40c372729d3da302a15eba2992ceb7f4c853f694761bcf331ce38ddef9e2b" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.060875 4935 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gsvpp" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.330775 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data\") pod \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\" (UID: \"a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f\") " Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.341439 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data" (OuterVolumeSpecName: "config-data") pod "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" (UID: "a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.434010 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.723325 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gsvpp"] Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.733798 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gsvpp"] Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.825218 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xk2l5"] Dec 17 09:24:20 crc kubenswrapper[4935]: E1217 09:24:20.825768 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" containerName="keystone-bootstrap" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.825795 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" 
containerName="keystone-bootstrap" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.826004 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" containerName="keystone-bootstrap" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.826912 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.833876 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.834380 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.834606 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bnktl" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.834886 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.835733 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.846759 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xk2l5"] Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.945915 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqvd\" (UniqueName: \"kubernetes.io/projected/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-kube-api-access-htqvd\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.946035 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-combined-ca-bundle\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.946094 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-fernet-keys\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.946173 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-scripts\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.946253 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-config-data\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:20 crc kubenswrapper[4935]: I1217 09:24:20.946321 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-credential-keys\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.048503 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-fernet-keys\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.048660 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-scripts\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.048881 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-config-data\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.048936 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-credential-keys\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.049058 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htqvd\" (UniqueName: \"kubernetes.io/projected/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-kube-api-access-htqvd\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.049140 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-combined-ca-bundle\") pod 
\"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.054572 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-credential-keys\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.056665 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-config-data\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.056767 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-combined-ca-bundle\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.057452 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-scripts\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.057791 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-fernet-keys\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: 
I1217 09:24:21.085719 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqvd\" (UniqueName: \"kubernetes.io/projected/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-kube-api-access-htqvd\") pod \"keystone-bootstrap-xk2l5\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.150508 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f" path="/var/lib/kubelet/pods/a40a609b-ecb4-4c7b-9b4e-34d6f3f2046f/volumes" Dec 17 09:24:21 crc kubenswrapper[4935]: I1217 09:24:21.152015 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:22 crc kubenswrapper[4935]: E1217 09:24:22.111657 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b" Dec 17 09:24:22 crc kubenswrapper[4935]: E1217 09:24:22.112351 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8dg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-rvtvc_openstack(0b52e650-9c70-4617-9fbb-12fbb5a1c3e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:24:22 crc kubenswrapper[4935]: E1217 09:24:22.113740 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-rvtvc" podUID="0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" Dec 17 09:24:23 crc kubenswrapper[4935]: E1217 09:24:23.113208 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b\\\"\"" pod="openstack/placement-db-sync-rvtvc" podUID="0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" Dec 17 09:24:25 crc kubenswrapper[4935]: E1217 09:24:25.127932 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7" Dec 17 09:24:25 crc kubenswrapper[4935]: E1217 09:24:25.128761 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n677h588h647h656h97h77h66hfbh58ch59ch654h5fbhc6h5cch67fh98h5d4hdbh655h5fch5dfhcfhfch686h9bh649hcch5cch547h5cbh575h5bbq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2vz4v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f84dfc49-qqm76_openstack(f0e28daf-a4e4-4dce-b43a-27a779a7e0e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:24:25 crc kubenswrapper[4935]: E1217 
09:24:25.131826 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:dd7600bc5278c663cfcfecafd3fb051a2cd2ddc3c1efb07738bf09512aa23ae7\\\"\"]" pod="openstack/horizon-7f84dfc49-qqm76" podUID="f0e28daf-a4e4-4dce-b43a-27a779a7e0e3" Dec 17 09:24:27 crc kubenswrapper[4935]: I1217 09:24:27.493148 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 17 09:24:30 crc kubenswrapper[4935]: I1217 09:24:30.131005 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:24:30 crc kubenswrapper[4935]: I1217 09:24:30.131707 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:24:30 crc kubenswrapper[4935]: I1217 09:24:30.131756 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:24:30 crc kubenswrapper[4935]: I1217 09:24:30.132349 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b2296c1b96eef7533eacbd1e8dfbd023ac687a2c9a2c2ca41bf8bcc87a90001d"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:24:30 crc kubenswrapper[4935]: I1217 09:24:30.132425 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://b2296c1b96eef7533eacbd1e8dfbd023ac687a2c9a2c2ca41bf8bcc87a90001d" gracePeriod=600 Dec 17 09:24:31 crc kubenswrapper[4935]: I1217 09:24:31.197673 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="b2296c1b96eef7533eacbd1e8dfbd023ac687a2c9a2c2ca41bf8bcc87a90001d" exitCode=0 Dec 17 09:24:31 crc kubenswrapper[4935]: I1217 09:24:31.197726 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"b2296c1b96eef7533eacbd1e8dfbd023ac687a2c9a2c2ca41bf8bcc87a90001d"} Dec 17 09:24:32 crc kubenswrapper[4935]: I1217 09:24:32.493781 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 17 09:24:32 crc kubenswrapper[4935]: I1217 09:24:32.494458 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.586372 4935 scope.go:117] "RemoveContainer" containerID="4beda71bb40d3ba3cd8691bc4fb1531ee2f28ecc5ac8964f7b4a375581ddcde6" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.703873 4935 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.848665 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-dns-svc\") pod \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.848767 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-sb\") pod \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.848889 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85j82\" (UniqueName: \"kubernetes.io/projected/582e42e7-25fe-4a3e-9a76-87f9c62af78c-kube-api-access-85j82\") pod \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.849064 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-nb\") pod \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.849150 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-config\") pod \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\" (UID: \"582e42e7-25fe-4a3e-9a76-87f9c62af78c\") " Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.856124 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/projected/582e42e7-25fe-4a3e-9a76-87f9c62af78c-kube-api-access-85j82" (OuterVolumeSpecName: "kube-api-access-85j82") pod "582e42e7-25fe-4a3e-9a76-87f9c62af78c" (UID: "582e42e7-25fe-4a3e-9a76-87f9c62af78c"). InnerVolumeSpecName "kube-api-access-85j82". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.896618 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "582e42e7-25fe-4a3e-9a76-87f9c62af78c" (UID: "582e42e7-25fe-4a3e-9a76-87f9c62af78c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.899918 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "582e42e7-25fe-4a3e-9a76-87f9c62af78c" (UID: "582e42e7-25fe-4a3e-9a76-87f9c62af78c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.911588 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "582e42e7-25fe-4a3e-9a76-87f9c62af78c" (UID: "582e42e7-25fe-4a3e-9a76-87f9c62af78c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.917453 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-config" (OuterVolumeSpecName: "config") pod "582e42e7-25fe-4a3e-9a76-87f9c62af78c" (UID: "582e42e7-25fe-4a3e-9a76-87f9c62af78c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.951794 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85j82\" (UniqueName: \"kubernetes.io/projected/582e42e7-25fe-4a3e-9a76-87f9c62af78c-kube-api-access-85j82\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.951850 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.951865 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.951973 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:33 crc kubenswrapper[4935]: I1217 09:24:33.952006 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/582e42e7-25fe-4a3e-9a76-87f9c62af78c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:34 crc kubenswrapper[4935]: E1217 09:24:34.213111 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Dec 17 09:24:34 crc kubenswrapper[4935]: E1217 09:24:34.213310 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4gz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-vhjl6_openstack(a62d0f30-735b-410e-ac80-50a98636ff47): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:24:34 crc kubenswrapper[4935]: E1217 09:24:34.214442 4935 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-vhjl6" podUID="a62d0f30-735b-410e-ac80-50a98636ff47" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.216609 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.233883 4935 generic.go:334] "Generic (PLEG): container finished" podID="99e284fa-5f85-409e-bcb3-fcb2b320a0fe" containerID="5c0545040d4c0d7a0defca6036355ee3fc839a7a9ac8518f0c60bdb0297de245" exitCode=0 Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.233960 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8ctb" event={"ID":"99e284fa-5f85-409e-bcb3-fcb2b320a0fe","Type":"ContainerDied","Data":"5c0545040d4c0d7a0defca6036355ee3fc839a7a9ac8518f0c60bdb0297de245"} Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.236017 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f84dfc49-qqm76" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.236030 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f84dfc49-qqm76" event={"ID":"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3","Type":"ContainerDied","Data":"85ead9b50b97c80f2f2b340cc9969a06d5e0500f541e8d7f950971d74ea7d46d"} Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.242790 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.242745 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" event={"ID":"582e42e7-25fe-4a3e-9a76-87f9c62af78c","Type":"ContainerDied","Data":"0ac329e8458184dc3d8cda0e5abe6f7ba22f23cd38dddcff996dbfa444691319"} Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.263929 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-config-data\") pod \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.263999 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vz4v\" (UniqueName: \"kubernetes.io/projected/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-kube-api-access-2vz4v\") pod \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.264065 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-scripts\") pod \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.264115 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-horizon-secret-key\") pod \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.264177 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-logs\") pod \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\" (UID: \"f0e28daf-a4e4-4dce-b43a-27a779a7e0e3\") " Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.279365 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-scripts" (OuterVolumeSpecName: "scripts") pod "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3" (UID: "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.279460 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-config-data" (OuterVolumeSpecName: "config-data") pod "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3" (UID: "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.279726 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-logs" (OuterVolumeSpecName: "logs") pod "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3" (UID: "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.284642 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-kube-api-access-2vz4v" (OuterVolumeSpecName: "kube-api-access-2vz4v") pod "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3" (UID: "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3"). InnerVolumeSpecName "kube-api-access-2vz4v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.291482 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3" (UID: "f0e28daf-a4e4-4dce-b43a-27a779a7e0e3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.312253 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-96f46"] Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.322484 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-96f46"] Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.366587 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.366624 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vz4v\" (UniqueName: \"kubernetes.io/projected/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-kube-api-access-2vz4v\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.366635 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.366645 4935 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.366655 4935 reconciler_common.go:293] 
"Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.615369 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f84dfc49-qqm76"] Dec 17 09:24:34 crc kubenswrapper[4935]: I1217 09:24:34.626097 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f84dfc49-qqm76"] Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.140569 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" path="/var/lib/kubelet/pods/582e42e7-25fe-4a3e-9a76-87f9c62af78c/volumes" Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.141348 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0e28daf-a4e4-4dce-b43a-27a779a7e0e3" path="/var/lib/kubelet/pods/f0e28daf-a4e4-4dce-b43a-27a779a7e0e3/volumes" Dec 17 09:24:35 crc kubenswrapper[4935]: E1217 09:24:35.255609 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-vhjl6" podUID="a62d0f30-735b-410e-ac80-50a98636ff47" Dec 17 09:24:35 crc kubenswrapper[4935]: E1217 09:24:35.640827 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Dec 17 09:24:35 crc kubenswrapper[4935]: E1217 09:24:35.641834 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x9sct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Ca
pabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-f7nmq_openstack(9b17c8be-6039-4aa6-8227-cd2dfc076f77): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:24:35 crc kubenswrapper[4935]: E1217 09:24:35.643584 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-f7nmq" podUID="9b17c8be-6039-4aa6-8227-cd2dfc076f77" Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.797679 4935 scope.go:117] "RemoveContainer" containerID="ae969a21e12d842efa534d2ea406cd7a71b99213044780025d473d6e16ccf39f" Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.859507 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.864318 4935 scope.go:117] "RemoveContainer" containerID="d76653a5dc31a7a0d5987a3874238d7ac5ac68675c001840510d3fd2fe2d7a96" Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.905361 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-combined-ca-bundle\") pod \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.905404 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-config\") pod \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.905549 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc4qj\" (UniqueName: \"kubernetes.io/projected/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-kube-api-access-vc4qj\") pod \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\" (UID: \"99e284fa-5f85-409e-bcb3-fcb2b320a0fe\") " Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.924234 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-kube-api-access-vc4qj" (OuterVolumeSpecName: "kube-api-access-vc4qj") pod "99e284fa-5f85-409e-bcb3-fcb2b320a0fe" (UID: "99e284fa-5f85-409e-bcb3-fcb2b320a0fe"). InnerVolumeSpecName "kube-api-access-vc4qj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.941714 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99e284fa-5f85-409e-bcb3-fcb2b320a0fe" (UID: "99e284fa-5f85-409e-bcb3-fcb2b320a0fe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:35 crc kubenswrapper[4935]: I1217 09:24:35.958874 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-config" (OuterVolumeSpecName: "config") pod "99e284fa-5f85-409e-bcb3-fcb2b320a0fe" (UID: "99e284fa-5f85-409e-bcb3-fcb2b320a0fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.007438 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc4qj\" (UniqueName: \"kubernetes.io/projected/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-kube-api-access-vc4qj\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.007491 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.007504 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/99e284fa-5f85-409e-bcb3-fcb2b320a0fe-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.188533 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbfb6547d-64jt7"] Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.273638 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-64f667bdd5-px4gj" event={"ID":"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc","Type":"ContainerStarted","Data":"9eaac25bcefbb36ab7789b5708c2696617578220b4e144bda5955dac76763fed"} Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.282948 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"65dd41ad94dd7bae1b7cbbd3c318eb23617db601045887b6ac1ed745fc1e5001"} Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.287747 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.288828 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687fb4989-g2n7c" event={"ID":"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7","Type":"ContainerStarted","Data":"14ae31a0284d360b60ba5de97c50ba8651c3950f140b682879e1e00c6dbfa715"} Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.295376 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerStarted","Data":"33e694debc362dac64987513367abf8519814666ab8bf4d62e9ac950f0001602"} Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.298742 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xk2l5"] Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.301834 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-g8ctb" event={"ID":"99e284fa-5f85-409e-bcb3-fcb2b320a0fe","Type":"ContainerDied","Data":"114a46d70b4ff70246b57dae562b7a1439293e7cfa68363ceb2ad566e7207730"} Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.301874 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="114a46d70b4ff70246b57dae562b7a1439293e7cfa68363ceb2ad566e7207730" Dec 17 
09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.301944 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-g8ctb" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.314029 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbfb6547d-64jt7" event={"ID":"3658abd7-bc1e-4359-aa8b-011fe7189342","Type":"ContainerStarted","Data":"f12b32c006bd357211e8667cca7abad8e2c8951b5a91562d3c51a553e6ef8987"} Dec 17 09:24:36 crc kubenswrapper[4935]: E1217 09:24:36.319876 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-f7nmq" podUID="9b17c8be-6039-4aa6-8227-cd2dfc076f77" Dec 17 09:24:36 crc kubenswrapper[4935]: W1217 09:24:36.359493 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabc82931_62eb_42ba_952d_9b1c99f2fd25.slice/crio-989501f96fa51bfbca20816f711b5f4ad320a7342a9db935d6aa808294b5d37f WatchSource:0}: Error finding container 989501f96fa51bfbca20816f711b5f4ad320a7342a9db935d6aa808294b5d37f: Status 404 returned error can't find the container with id 989501f96fa51bfbca20816f711b5f4ad320a7342a9db935d6aa808294b5d37f Dec 17 09:24:36 crc kubenswrapper[4935]: W1217 09:24:36.360097 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5385d045_3f7c_447d_8ce8_d12a8de0cdce.slice/crio-b74215db1bd1dd575fe06e49b4959bde1cde54925c011c89f26e1453d3916289 WatchSource:0}: Error finding container b74215db1bd1dd575fe06e49b4959bde1cde54925c011c89f26e1453d3916289: Status 404 returned error can't find the container with id 
b74215db1bd1dd575fe06e49b4959bde1cde54925c011c89f26e1453d3916289 Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.448968 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54c44548bb-26j2l"] Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.514330 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-knwqw"] Dec 17 09:24:36 crc kubenswrapper[4935]: E1217 09:24:36.518581 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="init" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.518600 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="init" Dec 17 09:24:36 crc kubenswrapper[4935]: E1217 09:24:36.518631 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.518638 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" Dec 17 09:24:36 crc kubenswrapper[4935]: E1217 09:24:36.518651 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e284fa-5f85-409e-bcb3-fcb2b320a0fe" containerName="neutron-db-sync" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.518657 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e284fa-5f85-409e-bcb3-fcb2b320a0fe" containerName="neutron-db-sync" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.518838 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.518857 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e284fa-5f85-409e-bcb3-fcb2b320a0fe" containerName="neutron-db-sync" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 
09:24:36.564903 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.618935 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.644336 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-knwqw"] Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.660022 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.660112 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2tw\" (UniqueName: \"kubernetes.io/projected/95378004-259e-4ed5-b3ac-2e7870949a91-kube-api-access-xx2tw\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.660146 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.661137 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-config\") pod 
\"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.661343 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.661392 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.766728 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.766798 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-config\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.766876 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: 
\"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.766908 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.766967 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.766994 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2tw\" (UniqueName: \"kubernetes.io/projected/95378004-259e-4ed5-b3ac-2e7870949a91-kube-api-access-xx2tw\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.768377 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.768923 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-config\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " 
pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.769566 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.770245 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.771022 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.888802 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2tw\" (UniqueName: \"kubernetes.io/projected/95378004-259e-4ed5-b3ac-2e7870949a91-kube-api-access-xx2tw\") pod \"dnsmasq-dns-6b9c8b59c-knwqw\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.952568 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-84f77988b8-57qwv"] Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.954352 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.957789 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.958179 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-45j9c" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.958353 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.971395 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-combined-ca-bundle\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.971496 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-httpd-config\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.971608 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-ovndb-tls-certs\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.971756 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9q6\" (UniqueName: 
\"kubernetes.io/projected/6796ebed-95e9-4af3-a76f-0acc484ddbfb-kube-api-access-gm9q6\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.971782 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-config\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:36 crc kubenswrapper[4935]: I1217 09:24:36.974231 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.053294 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84f77988b8-57qwv"] Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.083700 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-httpd-config\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.083841 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-ovndb-tls-certs\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.083898 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9q6\" (UniqueName: \"kubernetes.io/projected/6796ebed-95e9-4af3-a76f-0acc484ddbfb-kube-api-access-gm9q6\") pod \"neutron-84f77988b8-57qwv\" (UID: 
\"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.083929 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-config\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.083991 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-combined-ca-bundle\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.115169 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-httpd-config\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.120079 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-combined-ca-bundle\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.125786 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-config\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.142993 4935 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.154180 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9q6\" (UniqueName: \"kubernetes.io/projected/6796ebed-95e9-4af3-a76f-0acc484ddbfb-kube-api-access-gm9q6\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.196986 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-ovndb-tls-certs\") pod \"neutron-84f77988b8-57qwv\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.386602 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xk2l5" event={"ID":"b867f99b-0bea-4d24-88e7-4dc1c1f991e6","Type":"ContainerStarted","Data":"74b18765fd2e15e774860dba087b7765cd444eb7513801fbedd0868947d51bbb"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.387551 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.392339 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6","Type":"ContainerStarted","Data":"9a929878615707b4cd4504b26cee0370dab037a81d470423516668afc2cc6c70"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.407552 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbfb6547d-64jt7" event={"ID":"3658abd7-bc1e-4359-aa8b-011fe7189342","Type":"ContainerStarted","Data":"a8c0e7848a60d6e20138070716c192e1a3eba83b96c9ac7b24cdf805730765f0"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.416160 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f667bdd5-px4gj" event={"ID":"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc","Type":"ContainerStarted","Data":"69bda1557a71e6e1a0ed1d1935c9fc2f652c5445904e049de5ba53ddbc652d93"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.416432 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64f667bdd5-px4gj" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerName="horizon-log" containerID="cri-o://9eaac25bcefbb36ab7789b5708c2696617578220b4e144bda5955dac76763fed" gracePeriod=30 Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.416965 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-64f667bdd5-px4gj" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerName="horizon" containerID="cri-o://69bda1557a71e6e1a0ed1d1935c9fc2f652c5445904e049de5ba53ddbc652d93" gracePeriod=30 Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.422913 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687fb4989-g2n7c" 
event={"ID":"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7","Type":"ContainerStarted","Data":"32750d33a6bf3e214de395af044fa80bde112e7d6d36df1f148c3380da508b17"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.423131 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-687fb4989-g2n7c" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerName="horizon-log" containerID="cri-o://14ae31a0284d360b60ba5de97c50ba8651c3950f140b682879e1e00c6dbfa715" gracePeriod=30 Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.423337 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-687fb4989-g2n7c" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerName="horizon" containerID="cri-o://32750d33a6bf3e214de395af044fa80bde112e7d6d36df1f148c3380da508b17" gracePeriod=30 Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.446235 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c44548bb-26j2l" event={"ID":"5385d045-3f7c-447d-8ce8-d12a8de0cdce","Type":"ContainerStarted","Data":"26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.446313 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c44548bb-26j2l" event={"ID":"5385d045-3f7c-447d-8ce8-d12a8de0cdce","Type":"ContainerStarted","Data":"b74215db1bd1dd575fe06e49b4959bde1cde54925c011c89f26e1453d3916289"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.450165 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"abc82931-62eb-42ba-952d-9b1c99f2fd25","Type":"ContainerStarted","Data":"989501f96fa51bfbca20816f711b5f4ad320a7342a9db935d6aa808294b5d37f"} Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.472019 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-64f667bdd5-px4gj" 
podStartSLOduration=10.07518827 podStartE2EDuration="42.47199335s" podCreationTimestamp="2025-12-17 09:23:55 +0000 UTC" firstStartedPulling="2025-12-17 09:24:03.250992521 +0000 UTC m=+1162.910833284" lastFinishedPulling="2025-12-17 09:24:35.647797591 +0000 UTC m=+1195.307638364" observedRunningTime="2025-12-17 09:24:37.467159162 +0000 UTC m=+1197.126999925" watchObservedRunningTime="2025-12-17 09:24:37.47199335 +0000 UTC m=+1197.131834113" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.499505 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-96f46" podUID="582e42e7-25fe-4a3e-9a76-87f9c62af78c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 17 09:24:37 crc kubenswrapper[4935]: I1217 09:24:37.505133 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-687fb4989-g2n7c" podStartSLOduration=10.484471377 podStartE2EDuration="41.505109679s" podCreationTimestamp="2025-12-17 09:23:56 +0000 UTC" firstStartedPulling="2025-12-17 09:24:03.183932233 +0000 UTC m=+1162.843772996" lastFinishedPulling="2025-12-17 09:24:34.204570535 +0000 UTC m=+1193.864411298" observedRunningTime="2025-12-17 09:24:37.494831138 +0000 UTC m=+1197.154671901" watchObservedRunningTime="2025-12-17 09:24:37.505109679 +0000 UTC m=+1197.164950452" Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.129256 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-knwqw"] Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.339127 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-84f77988b8-57qwv"] Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.479819 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xk2l5" 
event={"ID":"b867f99b-0bea-4d24-88e7-4dc1c1f991e6","Type":"ContainerStarted","Data":"42c8ce9d58026ac8daf60247c2367152c2c018b6ea58efe28a489e158d919d97"} Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.494237 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f77988b8-57qwv" event={"ID":"6796ebed-95e9-4af3-a76f-0acc484ddbfb","Type":"ContainerStarted","Data":"44e8eda982bccbf6bc6fb8e8e1ba2a567770a6aba1fb17da49c2d1254b5ba516"} Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.504597 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rvtvc" event={"ID":"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0","Type":"ContainerStarted","Data":"3ea148c6b6a672fe1cdb81351986afd65cd19564e042063d59a6a01552c2d554"} Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.508473 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xk2l5" podStartSLOduration=18.508449852 podStartE2EDuration="18.508449852s" podCreationTimestamp="2025-12-17 09:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:38.503695916 +0000 UTC m=+1198.163536679" watchObservedRunningTime="2025-12-17 09:24:38.508449852 +0000 UTC m=+1198.168290615" Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.513481 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" event={"ID":"95378004-259e-4ed5-b3ac-2e7870949a91","Type":"ContainerStarted","Data":"1cd71e05e56d827006399a109755b4ce1e84eb9388d63ccc61a7f52c1bbf76fe"} Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.522458 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbfb6547d-64jt7" event={"ID":"3658abd7-bc1e-4359-aa8b-011fe7189342","Type":"ContainerStarted","Data":"6a3e22864172d577e915adfe89fd5886266135608280872f4a24cbf78156f8d4"} Dec 17 09:24:38 crc 
kubenswrapper[4935]: I1217 09:24:38.531026 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-rvtvc" podStartSLOduration=8.937152479 podStartE2EDuration="42.531001563s" podCreationTimestamp="2025-12-17 09:23:56 +0000 UTC" firstStartedPulling="2025-12-17 09:24:03.264684595 +0000 UTC m=+1162.924525358" lastFinishedPulling="2025-12-17 09:24:36.858533679 +0000 UTC m=+1196.518374442" observedRunningTime="2025-12-17 09:24:38.528245345 +0000 UTC m=+1198.188086108" watchObservedRunningTime="2025-12-17 09:24:38.531001563 +0000 UTC m=+1198.190842326" Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.539080 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c44548bb-26j2l" event={"ID":"5385d045-3f7c-447d-8ce8-d12a8de0cdce","Type":"ContainerStarted","Data":"0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1"} Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.547010 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"abc82931-62eb-42ba-952d-9b1c99f2fd25","Type":"ContainerStarted","Data":"b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674"} Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.584546 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bbfb6547d-64jt7" podStartSLOduration=30.5845275 podStartE2EDuration="30.5845275s" podCreationTimestamp="2025-12-17 09:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:38.560848631 +0000 UTC m=+1198.220689394" watchObservedRunningTime="2025-12-17 09:24:38.5845275 +0000 UTC m=+1198.244368263" Dec 17 09:24:38 crc kubenswrapper[4935]: I1217 09:24:38.602352 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54c44548bb-26j2l" 
podStartSLOduration=30.602330745 podStartE2EDuration="30.602330745s" podCreationTimestamp="2025-12-17 09:24:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:38.582991922 +0000 UTC m=+1198.242832685" watchObservedRunningTime="2025-12-17 09:24:38.602330745 +0000 UTC m=+1198.262171528" Dec 17 09:24:39 crc kubenswrapper[4935]: I1217 09:24:39.064041 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:39 crc kubenswrapper[4935]: I1217 09:24:39.065453 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:24:39 crc kubenswrapper[4935]: I1217 09:24:39.231459 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:39 crc kubenswrapper[4935]: I1217 09:24:39.231539 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:24:39 crc kubenswrapper[4935]: I1217 09:24:39.571236 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6","Type":"ContainerStarted","Data":"fb66ef7f61349a9a3e96e72f5c8de675bcd402e7dcba7184f7710ec6b8f5cc16"} Dec 17 09:24:39 crc kubenswrapper[4935]: I1217 09:24:39.573091 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f77988b8-57qwv" event={"ID":"6796ebed-95e9-4af3-a76f-0acc484ddbfb","Type":"ContainerStarted","Data":"3530ad080d1edcab525f47d2d9a66c1e08f558003b8a420933927b17595a5589"} Dec 17 09:24:39 crc kubenswrapper[4935]: I1217 09:24:39.574466 4935 generic.go:334] "Generic (PLEG): container finished" podID="95378004-259e-4ed5-b3ac-2e7870949a91" containerID="e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647" exitCode=0 Dec 17 09:24:39 crc 
kubenswrapper[4935]: I1217 09:24:39.575960 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" event={"ID":"95378004-259e-4ed5-b3ac-2e7870949a91","Type":"ContainerDied","Data":"e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647"} Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.396134 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7c8457c6df-5qkkl"] Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.398532 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.405964 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.406134 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.445987 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-ovndb-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.446051 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-config\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.446081 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-combined-ca-bundle\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.446104 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-public-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.446172 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-internal-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.446232 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-httpd-config\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.446249 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jrd5\" (UniqueName: \"kubernetes.io/projected/67591c00-7d49-4db4-af34-8901c57dbb0b-kube-api-access-2jrd5\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.452076 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c8457c6df-5qkkl"] Dec 17 09:24:40 crc 
kubenswrapper[4935]: I1217 09:24:40.547048 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-internal-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.547150 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-httpd-config\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.547180 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jrd5\" (UniqueName: \"kubernetes.io/projected/67591c00-7d49-4db4-af34-8901c57dbb0b-kube-api-access-2jrd5\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.547261 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-ovndb-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.547312 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-config\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.547336 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-combined-ca-bundle\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.547358 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-public-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.556499 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-combined-ca-bundle\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.557531 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-httpd-config\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.557881 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-config\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.557996 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-ovndb-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.558465 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-internal-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.560759 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/67591c00-7d49-4db4-af34-8901c57dbb0b-public-tls-certs\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.581793 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jrd5\" (UniqueName: \"kubernetes.io/projected/67591c00-7d49-4db4-af34-8901c57dbb0b-kube-api-access-2jrd5\") pod \"neutron-7c8457c6df-5qkkl\" (UID: \"67591c00-7d49-4db4-af34-8901c57dbb0b\") " pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.607918 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" event={"ID":"95378004-259e-4ed5-b3ac-2e7870949a91","Type":"ContainerStarted","Data":"4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70"} Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.621380 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"abc82931-62eb-42ba-952d-9b1c99f2fd25","Type":"ContainerStarted","Data":"f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca"} Dec 17 09:24:40 crc 
kubenswrapper[4935]: I1217 09:24:40.630666 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6","Type":"ContainerStarted","Data":"60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57"} Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.631321 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-log" containerID="cri-o://fb66ef7f61349a9a3e96e72f5c8de675bcd402e7dcba7184f7710ec6b8f5cc16" gracePeriod=30 Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.631417 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-httpd" containerID="cri-o://60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57" gracePeriod=30 Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.645428 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f77988b8-57qwv" event={"ID":"6796ebed-95e9-4af3-a76f-0acc484ddbfb","Type":"ContainerStarted","Data":"780433440bfe8714d3a013717e8a34ddf923ead49f1825269886dcb6621a5661"} Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.645485 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.671067 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=31.671030514 podStartE2EDuration="31.671030514s" podCreationTimestamp="2025-12-17 09:24:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:40.666092994 +0000 UTC m=+1200.325933937" 
watchObservedRunningTime="2025-12-17 09:24:40.671030514 +0000 UTC m=+1200.330871277" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.706007 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=33.705985768 podStartE2EDuration="33.705985768s" podCreationTimestamp="2025-12-17 09:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:40.696926857 +0000 UTC m=+1200.356767620" watchObservedRunningTime="2025-12-17 09:24:40.705985768 +0000 UTC m=+1200.365826521" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.728005 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-84f77988b8-57qwv" podStartSLOduration=4.727964905 podStartE2EDuration="4.727964905s" podCreationTimestamp="2025-12-17 09:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:40.727667098 +0000 UTC m=+1200.387507861" watchObservedRunningTime="2025-12-17 09:24:40.727964905 +0000 UTC m=+1200.387805668" Dec 17 09:24:40 crc kubenswrapper[4935]: I1217 09:24:40.754768 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:41 crc kubenswrapper[4935]: I1217 09:24:41.672906 4935 generic.go:334] "Generic (PLEG): container finished" podID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerID="60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57" exitCode=0 Dec 17 09:24:41 crc kubenswrapper[4935]: I1217 09:24:41.673518 4935 generic.go:334] "Generic (PLEG): container finished" podID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerID="fb66ef7f61349a9a3e96e72f5c8de675bcd402e7dcba7184f7710ec6b8f5cc16" exitCode=143 Dec 17 09:24:41 crc kubenswrapper[4935]: I1217 09:24:41.672957 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6","Type":"ContainerDied","Data":"60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57"} Dec 17 09:24:41 crc kubenswrapper[4935]: I1217 09:24:41.673574 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6","Type":"ContainerDied","Data":"fb66ef7f61349a9a3e96e72f5c8de675bcd402e7dcba7184f7710ec6b8f5cc16"} Dec 17 09:24:43 crc kubenswrapper[4935]: I1217 09:24:41.712771 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" podStartSLOduration=5.712748285 podStartE2EDuration="5.712748285s" podCreationTimestamp="2025-12-17 09:24:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:41.700003284 +0000 UTC m=+1201.359844047" watchObservedRunningTime="2025-12-17 09:24:41.712748285 +0000 UTC m=+1201.372589068" Dec 17 09:24:43 crc kubenswrapper[4935]: I1217 09:24:42.087121 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7c8457c6df-5qkkl"] Dec 17 09:24:43 crc kubenswrapper[4935]: I1217 09:24:42.143981 
4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:43 crc kubenswrapper[4935]: I1217 09:24:42.692438 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c8457c6df-5qkkl" event={"ID":"67591c00-7d49-4db4-af34-8901c57dbb0b","Type":"ContainerStarted","Data":"28fe6e4712c10a9a5b064628264a9b86a17701332c746dd567d1ceb3115531f6"} Dec 17 09:24:43 crc kubenswrapper[4935]: I1217 09:24:42.696978 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerStarted","Data":"0ba2fc44e91ee7ba44310f25eaf57f763e280313b2746fec7830e9ec68e45894"} Dec 17 09:24:43 crc kubenswrapper[4935]: I1217 09:24:43.713372 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c8457c6df-5qkkl" event={"ID":"67591c00-7d49-4db4-af34-8901c57dbb0b","Type":"ContainerStarted","Data":"693d1d185d267ff6b6872cda24fcc95bb56fc05d29465b900bd8b3c52e4e6425"} Dec 17 09:24:43 crc kubenswrapper[4935]: I1217 09:24:43.714174 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7c8457c6df-5qkkl" event={"ID":"67591c00-7d49-4db4-af34-8901c57dbb0b","Type":"ContainerStarted","Data":"a431654613baf76d34c96cabdf1af8ee0950c714df7563230d4efbd0554435c3"} Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.246327 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.291086 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-config-data\") pod \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.291613 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-scripts\") pod \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.291959 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpjh7\" (UniqueName: \"kubernetes.io/projected/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-kube-api-access-kpjh7\") pod \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.292024 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-logs\") pod \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.292085 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-httpd-run\") pod \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.292117 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.292201 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-combined-ca-bundle\") pod \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\" (UID: \"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6\") " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.293573 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" (UID: "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.293916 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-logs" (OuterVolumeSpecName: "logs") pod "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" (UID: "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.300399 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-scripts" (OuterVolumeSpecName: "scripts") pod "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" (UID: "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.301112 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-kube-api-access-kpjh7" (OuterVolumeSpecName: "kube-api-access-kpjh7") pod "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" (UID: "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6"). InnerVolumeSpecName "kube-api-access-kpjh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.313651 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" (UID: "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.328521 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" (UID: "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.368734 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-config-data" (OuterVolumeSpecName: "config-data") pod "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" (UID: "8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.400320 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.400390 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.400401 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpjh7\" (UniqueName: \"kubernetes.io/projected/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-kube-api-access-kpjh7\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.400441 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.400450 4935 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.400521 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.400538 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.426651 4935 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.502842 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.728717 4935 generic.go:334] "Generic (PLEG): container finished" podID="b867f99b-0bea-4d24-88e7-4dc1c1f991e6" containerID="42c8ce9d58026ac8daf60247c2367152c2c018b6ea58efe28a489e158d919d97" exitCode=0 Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.728812 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xk2l5" event={"ID":"b867f99b-0bea-4d24-88e7-4dc1c1f991e6","Type":"ContainerDied","Data":"42c8ce9d58026ac8daf60247c2367152c2c018b6ea58efe28a489e158d919d97"} Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.734718 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6","Type":"ContainerDied","Data":"9a929878615707b4cd4504b26cee0370dab037a81d470423516668afc2cc6c70"} Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.734802 4935 scope.go:117] "RemoveContainer" containerID="60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.735054 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.743380 4935 generic.go:334] "Generic (PLEG): container finished" podID="0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" containerID="3ea148c6b6a672fe1cdb81351986afd65cd19564e042063d59a6a01552c2d554" exitCode=0 Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.743456 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rvtvc" event={"ID":"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0","Type":"ContainerDied","Data":"3ea148c6b6a672fe1cdb81351986afd65cd19564e042063d59a6a01552c2d554"} Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.743720 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.773631 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7c8457c6df-5qkkl" podStartSLOduration=4.773581966 podStartE2EDuration="4.773581966s" podCreationTimestamp="2025-12-17 09:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:44.767418365 +0000 UTC m=+1204.427259128" watchObservedRunningTime="2025-12-17 09:24:44.773581966 +0000 UTC m=+1204.433422729" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.795902 4935 scope.go:117] "RemoveContainer" containerID="fb66ef7f61349a9a3e96e72f5c8de675bcd402e7dcba7184f7710ec6b8f5cc16" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.847693 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.870881 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.885224 4935 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:44 crc kubenswrapper[4935]: E1217 09:24:44.885800 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-log" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.885820 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-log" Dec 17 09:24:44 crc kubenswrapper[4935]: E1217 09:24:44.885856 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-httpd" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.885867 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-httpd" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.886069 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-httpd" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.886112 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" containerName="glance-log" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.887432 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.893116 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.894335 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 17 09:24:44 crc kubenswrapper[4935]: I1217 09:24:44.895396 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017025 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017108 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017147 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkj7z\" (UniqueName: \"kubernetes.io/projected/99a5eb5b-829d-49da-a3d1-770d615b9c5a-kube-api-access-xkj7z\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017181 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017494 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017691 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017749 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.017764 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.120491 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.120976 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.121006 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-logs\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.121160 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.121202 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.121257 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkj7z\" (UniqueName: 
\"kubernetes.io/projected/99a5eb5b-829d-49da-a3d1-770d615b9c5a-kube-api-access-xkj7z\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.121334 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.121428 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.121859 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.122209 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.122544 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.127201 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.135994 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.147721 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.149871 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.173430 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6" path="/var/lib/kubelet/pods/8cca130e-5dbc-4edb-b0d9-f04a0ecd2ea6/volumes" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.174334 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xkj7z\" (UniqueName: \"kubernetes.io/projected/99a5eb5b-829d-49da-a3d1-770d615b9c5a-kube-api-access-xkj7z\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.212857 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.242697 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:45 crc kubenswrapper[4935]: I1217 09:24:45.931591 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.290757 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-rvtvc" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.293381 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461148 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-scripts\") pod \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461216 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-combined-ca-bundle\") pod \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461245 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-fernet-keys\") pod \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461297 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-config-data\") pod \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461351 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-combined-ca-bundle\") pod \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461422 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-config-data\") pod \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461540 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-credential-keys\") pod \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461623 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-logs\") pod \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461702 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htqvd\" (UniqueName: \"kubernetes.io/projected/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-kube-api-access-htqvd\") pod \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\" (UID: \"b867f99b-0bea-4d24-88e7-4dc1c1f991e6\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461765 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-scripts\") pod \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.461798 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8dg8\" (UniqueName: \"kubernetes.io/projected/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-kube-api-access-w8dg8\") pod \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\" (UID: \"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0\") " Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.462729 4935 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-logs" (OuterVolumeSpecName: "logs") pod "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" (UID: "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.474395 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-scripts" (OuterVolumeSpecName: "scripts") pod "b867f99b-0bea-4d24-88e7-4dc1c1f991e6" (UID: "b867f99b-0bea-4d24-88e7-4dc1c1f991e6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.474728 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b867f99b-0bea-4d24-88e7-4dc1c1f991e6" (UID: "b867f99b-0bea-4d24-88e7-4dc1c1f991e6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.476459 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "b867f99b-0bea-4d24-88e7-4dc1c1f991e6" (UID: "b867f99b-0bea-4d24-88e7-4dc1c1f991e6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.478883 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-kube-api-access-w8dg8" (OuterVolumeSpecName: "kube-api-access-w8dg8") pod "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" (UID: "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0"). 
InnerVolumeSpecName "kube-api-access-w8dg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.480410 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-kube-api-access-htqvd" (OuterVolumeSpecName: "kube-api-access-htqvd") pod "b867f99b-0bea-4d24-88e7-4dc1c1f991e6" (UID: "b867f99b-0bea-4d24-88e7-4dc1c1f991e6"). InnerVolumeSpecName "kube-api-access-htqvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.502478 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-scripts" (OuterVolumeSpecName: "scripts") pod "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" (UID: "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.527660 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-config-data" (OuterVolumeSpecName: "config-data") pod "b867f99b-0bea-4d24-88e7-4dc1c1f991e6" (UID: "b867f99b-0bea-4d24-88e7-4dc1c1f991e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.531371 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-config-data" (OuterVolumeSpecName: "config-data") pod "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" (UID: "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.533114 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" (UID: "0b52e650-9c70-4617-9fbb-12fbb5a1c3e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.554375 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b867f99b-0bea-4d24-88e7-4dc1c1f991e6" (UID: "b867f99b-0bea-4d24-88e7-4dc1c1f991e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565560 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htqvd\" (UniqueName: \"kubernetes.io/projected/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-kube-api-access-htqvd\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565608 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565627 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8dg8\" (UniqueName: \"kubernetes.io/projected/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-kube-api-access-w8dg8\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565639 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-scripts\") on node \"crc\" 
DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565653 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565663 4935 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565675 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565707 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565720 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565730 4935 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b867f99b-0bea-4d24-88e7-4dc1c1f991e6-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.565741 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.722034 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.778555 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99a5eb5b-829d-49da-a3d1-770d615b9c5a","Type":"ContainerStarted","Data":"f8b4883fe673e97960fc30362e9a5b303f662f818179df8bf170a5e8b24422e5"} Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.790320 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xk2l5" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.792572 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xk2l5" event={"ID":"b867f99b-0bea-4d24-88e7-4dc1c1f991e6","Type":"ContainerDied","Data":"74b18765fd2e15e774860dba087b7765cd444eb7513801fbedd0868947d51bbb"} Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.792652 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74b18765fd2e15e774860dba087b7765cd444eb7513801fbedd0868947d51bbb" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.803943 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-rvtvc" event={"ID":"0b52e650-9c70-4617-9fbb-12fbb5a1c3e0","Type":"ContainerDied","Data":"52b728d267838ea5355f4043d93c8d4d7d483e4b5f5dd66b4b55d1dc1b0c1bde"} Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.804201 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b728d267838ea5355f4043d93c8d4d7d483e4b5f5dd66b4b55d1dc1b0c1bde" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.804363 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-rvtvc" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.889431 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-548ff6dcf4-brlq9"] Dec 17 09:24:46 crc kubenswrapper[4935]: E1217 09:24:46.890426 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" containerName="placement-db-sync" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.890449 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" containerName="placement-db-sync" Dec 17 09:24:46 crc kubenswrapper[4935]: E1217 09:24:46.890471 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b867f99b-0bea-4d24-88e7-4dc1c1f991e6" containerName="keystone-bootstrap" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.890486 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b867f99b-0bea-4d24-88e7-4dc1c1f991e6" containerName="keystone-bootstrap" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.890738 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" containerName="placement-db-sync" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.890758 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="b867f99b-0bea-4d24-88e7-4dc1c1f991e6" containerName="keystone-bootstrap" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.891555 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.895332 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.895682 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.895825 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.904108 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.904557 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-bnktl" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.904710 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.925891 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-548ff6dcf4-brlq9"] Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980538 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-combined-ca-bundle\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980638 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-config-data\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " 
pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980689 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjtqj\" (UniqueName: \"kubernetes.io/projected/b3785d8e-a1d0-41db-81df-41ba57d019e5-kube-api-access-zjtqj\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980730 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-scripts\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980758 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-internal-tls-certs\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980812 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-fernet-keys\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980832 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-public-tls-certs\") pod \"keystone-548ff6dcf4-brlq9\" (UID: 
\"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:46 crc kubenswrapper[4935]: I1217 09:24:46.980878 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-credential-keys\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.046414 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.087641 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-combined-ca-bundle\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.087755 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-config-data\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.087818 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjtqj\" (UniqueName: \"kubernetes.io/projected/b3785d8e-a1d0-41db-81df-41ba57d019e5-kube-api-access-zjtqj\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.087853 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-scripts\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.087887 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-internal-tls-certs\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.087945 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-fernet-keys\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.087972 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-public-tls-certs\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.088013 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-credential-keys\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.099506 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-public-tls-certs\") pod 
\"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.101856 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-combined-ca-bundle\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.103025 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-fernet-keys\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.105982 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-internal-tls-certs\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.107435 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-config-data\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.107992 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-credential-keys\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 
09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.117529 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3785d8e-a1d0-41db-81df-41ba57d019e5-scripts\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.138150 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjtqj\" (UniqueName: \"kubernetes.io/projected/b3785d8e-a1d0-41db-81df-41ba57d019e5-kube-api-access-zjtqj\") pod \"keystone-548ff6dcf4-brlq9\" (UID: \"b3785d8e-a1d0-41db-81df-41ba57d019e5\") " pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.187785 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.207792 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5fc9d9db96-gh2fm"] Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.210624 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.224177 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-tmblm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.224830 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.225006 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.225159 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.225353 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.246898 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.293436 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fc9d9db96-gh2fm"] Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.305131 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjjf\" (UniqueName: \"kubernetes.io/projected/f4511340-5f20-486c-b7c3-2e4b04f60a14-kube-api-access-hkjjf\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.305315 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-combined-ca-bundle\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.305356 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-config-data\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.305417 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-scripts\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.305444 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4511340-5f20-486c-b7c3-2e4b04f60a14-logs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.305465 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-internal-tls-certs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.305521 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-public-tls-certs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.380250 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-nl2wv"] Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.380578 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerName="dnsmasq-dns" containerID="cri-o://2dcc04ce65f67c91261b2221371e56d87a43818d65b35702eb57dd40dcae49f2" gracePeriod=10 Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.407804 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-public-tls-certs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 
09:24:47.407899 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjjf\" (UniqueName: \"kubernetes.io/projected/f4511340-5f20-486c-b7c3-2e4b04f60a14-kube-api-access-hkjjf\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.407967 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-combined-ca-bundle\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.408001 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-config-data\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.408042 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-scripts\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.408066 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4511340-5f20-486c-b7c3-2e4b04f60a14-logs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.408085 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-internal-tls-certs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.417080 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-combined-ca-bundle\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.418731 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4511340-5f20-486c-b7c3-2e4b04f60a14-logs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.418750 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-public-tls-certs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.419096 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection refused" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.421025 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-scripts\") pod \"placement-5fc9d9db96-gh2fm\" (UID: 
\"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.428185 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-internal-tls-certs\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.445953 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4511340-5f20-486c-b7c3-2e4b04f60a14-config-data\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.449622 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjjf\" (UniqueName: \"kubernetes.io/projected/f4511340-5f20-486c-b7c3-2e4b04f60a14-kube-api-access-hkjjf\") pod \"placement-5fc9d9db96-gh2fm\" (UID: \"f4511340-5f20-486c-b7c3-2e4b04f60a14\") " pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.579912 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.829381 4935 generic.go:334] "Generic (PLEG): container finished" podID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerID="2dcc04ce65f67c91261b2221371e56d87a43818d65b35702eb57dd40dcae49f2" exitCode=0 Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.830108 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" event={"ID":"fe048f60-c05c-4561-88bc-42d1b9eecd6c","Type":"ContainerDied","Data":"2dcc04ce65f67c91261b2221371e56d87a43818d65b35702eb57dd40dcae49f2"} Dec 17 09:24:47 crc kubenswrapper[4935]: I1217 09:24:47.834974 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99a5eb5b-829d-49da-a3d1-770d615b9c5a","Type":"ContainerStarted","Data":"32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d"} Dec 17 09:24:49 crc kubenswrapper[4935]: I1217 09:24:49.065833 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 17 09:24:49 crc kubenswrapper[4935]: I1217 09:24:49.232043 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bbfb6547d-64jt7" podUID="3658abd7-bc1e-4359-aa8b-011fe7189342" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 17 09:24:50 crc kubenswrapper[4935]: I1217 09:24:50.392831 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 17 09:24:50 crc kubenswrapper[4935]: I1217 09:24:50.393450 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 17 09:24:50 crc kubenswrapper[4935]: I1217 09:24:50.432304 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 17 09:24:50 crc kubenswrapper[4935]: I1217 09:24:50.496525 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 17 09:24:50 crc kubenswrapper[4935]: I1217 09:24:50.875814 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 17 09:24:50 crc kubenswrapper[4935]: I1217 09:24:50.875863 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 17 09:24:52 crc kubenswrapper[4935]: I1217 09:24:52.417863 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.138:5353: connect: connection refused" Dec 17 09:24:52 crc kubenswrapper[4935]: I1217 09:24:52.951759 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.072727 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-nb\") pod \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.072825 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-svc\") pod \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.072981 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-swift-storage-0\") pod \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.073061 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-sb\") pod \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.073089 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgqmg\" (UniqueName: \"kubernetes.io/projected/fe048f60-c05c-4561-88bc-42d1b9eecd6c-kube-api-access-sgqmg\") pod \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.073230 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-config\") pod \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\" (UID: \"fe048f60-c05c-4561-88bc-42d1b9eecd6c\") " Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.090619 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe048f60-c05c-4561-88bc-42d1b9eecd6c-kube-api-access-sgqmg" (OuterVolumeSpecName: "kube-api-access-sgqmg") pod "fe048f60-c05c-4561-88bc-42d1b9eecd6c" (UID: "fe048f60-c05c-4561-88bc-42d1b9eecd6c"). InnerVolumeSpecName "kube-api-access-sgqmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.180414 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-config" (OuterVolumeSpecName: "config") pod "fe048f60-c05c-4561-88bc-42d1b9eecd6c" (UID: "fe048f60-c05c-4561-88bc-42d1b9eecd6c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.183053 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgqmg\" (UniqueName: \"kubernetes.io/projected/fe048f60-c05c-4561-88bc-42d1b9eecd6c-kube-api-access-sgqmg\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.183093 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.206641 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-548ff6dcf4-brlq9"] Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.214366 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5fc9d9db96-gh2fm"] Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.391071 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe048f60-c05c-4561-88bc-42d1b9eecd6c" (UID: "fe048f60-c05c-4561-88bc-42d1b9eecd6c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.432971 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe048f60-c05c-4561-88bc-42d1b9eecd6c" (UID: "fe048f60-c05c-4561-88bc-42d1b9eecd6c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.447076 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe048f60-c05c-4561-88bc-42d1b9eecd6c" (UID: "fe048f60-c05c-4561-88bc-42d1b9eecd6c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.458687 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe048f60-c05c-4561-88bc-42d1b9eecd6c" (UID: "fe048f60-c05c-4561-88bc-42d1b9eecd6c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.504721 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.504768 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.504782 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.504791 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe048f60-c05c-4561-88bc-42d1b9eecd6c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 
09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.564393 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.564515 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.565872 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.987068 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.987172 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5dc4fcdbc-nl2wv" event={"ID":"fe048f60-c05c-4561-88bc-42d1b9eecd6c","Type":"ContainerDied","Data":"c092f40bfeea17a4782fdd221416617988c696650416e92c6f8d0324413851f8"} Dec 17 09:24:53 crc kubenswrapper[4935]: I1217 09:24:53.988431 4935 scope.go:117] "RemoveContainer" containerID="2dcc04ce65f67c91261b2221371e56d87a43818d65b35702eb57dd40dcae49f2" Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.000496 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhjl6" event={"ID":"a62d0f30-735b-410e-ac80-50a98636ff47","Type":"ContainerStarted","Data":"99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e"} Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.034406 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-vhjl6" podStartSLOduration=7.955850385 podStartE2EDuration="58.034375919s" podCreationTimestamp="2025-12-17 09:23:56 +0000 UTC" firstStartedPulling="2025-12-17 09:24:02.729632598 +0000 UTC m=+1162.389473361" lastFinishedPulling="2025-12-17 09:24:52.808158132 +0000 UTC m=+1212.467998895" observedRunningTime="2025-12-17 09:24:54.031942719 
+0000 UTC m=+1213.691783482" watchObservedRunningTime="2025-12-17 09:24:54.034375919 +0000 UTC m=+1213.694216682" Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.037107 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99a5eb5b-829d-49da-a3d1-770d615b9c5a","Type":"ContainerStarted","Data":"9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280"} Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.054755 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc9d9db96-gh2fm" event={"ID":"f4511340-5f20-486c-b7c3-2e4b04f60a14","Type":"ContainerStarted","Data":"79b0e6bf8730942c9c2ba3ad3997ce5a3dce51a8b829b7feb59c10d09acd44e3"} Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.079002 4935 scope.go:117] "RemoveContainer" containerID="c7de911a37b6d5b1f5a2f4b12c118eaf3ac99c74e315beb0133a888b909c6748" Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.090209 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerStarted","Data":"b59fef636375afb6a8b1c06ccfd489e69629eb432f6376cb2e3c477ba0298ae5"} Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.109502 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=10.109473062 podStartE2EDuration="10.109473062s" podCreationTimestamp="2025-12-17 09:24:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:54.090739955 +0000 UTC m=+1213.750580718" watchObservedRunningTime="2025-12-17 09:24:54.109473062 +0000 UTC m=+1213.769313825" Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.121235 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548ff6dcf4-brlq9" 
event={"ID":"b3785d8e-a1d0-41db-81df-41ba57d019e5","Type":"ContainerStarted","Data":"1e331283925b0626b757b2fa243e1e7b5309eb7218fde6cec0fb4b863e232973"} Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.190366 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-nl2wv"] Dec 17 09:24:54 crc kubenswrapper[4935]: I1217 09:24:54.215205 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-nl2wv"] Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.135186 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" path="/var/lib/kubelet/pods/fe048f60-c05c-4561-88bc-42d1b9eecd6c/volumes" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.139233 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc9d9db96-gh2fm" event={"ID":"f4511340-5f20-486c-b7c3-2e4b04f60a14","Type":"ContainerStarted","Data":"e7b5bb36d9137be250de704f301070d64fb0220cc811c918ad1bc26168968882"} Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.139314 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5fc9d9db96-gh2fm" event={"ID":"f4511340-5f20-486c-b7c3-2e4b04f60a14","Type":"ContainerStarted","Data":"09ffdf26594c9746ded02ab70caed3c55a951dd778d4598eb51e523ff5c1aff0"} Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.139339 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.139360 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.140755 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7nmq" 
event={"ID":"9b17c8be-6039-4aa6-8227-cd2dfc076f77","Type":"ContainerStarted","Data":"9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce"} Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.142550 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-548ff6dcf4-brlq9" event={"ID":"b3785d8e-a1d0-41db-81df-41ba57d019e5","Type":"ContainerStarted","Data":"c95c4a1570ae16c9c72c0a54f21327619bb0f19998c0b1edd7c8a64386ccce76"} Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.142772 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.168397 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5fc9d9db96-gh2fm" podStartSLOduration=8.168375563 podStartE2EDuration="8.168375563s" podCreationTimestamp="2025-12-17 09:24:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:55.162689204 +0000 UTC m=+1214.822529967" watchObservedRunningTime="2025-12-17 09:24:55.168375563 +0000 UTC m=+1214.828216316" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.190505 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-548ff6dcf4-brlq9" podStartSLOduration=9.190482182 podStartE2EDuration="9.190482182s" podCreationTimestamp="2025-12-17 09:24:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:24:55.187685644 +0000 UTC m=+1214.847526427" watchObservedRunningTime="2025-12-17 09:24:55.190482182 +0000 UTC m=+1214.850322945" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.222340 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-f7nmq" podStartSLOduration=9.66155754 
podStartE2EDuration="59.222307339s" podCreationTimestamp="2025-12-17 09:23:56 +0000 UTC" firstStartedPulling="2025-12-17 09:24:03.249461183 +0000 UTC m=+1162.909301946" lastFinishedPulling="2025-12-17 09:24:52.810210982 +0000 UTC m=+1212.470051745" observedRunningTime="2025-12-17 09:24:55.214686584 +0000 UTC m=+1214.874527357" watchObservedRunningTime="2025-12-17 09:24:55.222307339 +0000 UTC m=+1214.882148102" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.244364 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.244418 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.281889 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:55 crc kubenswrapper[4935]: I1217 09:24:55.306926 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:56 crc kubenswrapper[4935]: I1217 09:24:56.170481 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:56 crc kubenswrapper[4935]: I1217 09:24:56.170889 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:57 crc kubenswrapper[4935]: I1217 09:24:57.189323 4935 generic.go:334] "Generic (PLEG): container finished" podID="a62d0f30-735b-410e-ac80-50a98636ff47" containerID="99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e" exitCode=0 Dec 17 09:24:57 crc kubenswrapper[4935]: I1217 09:24:57.189417 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhjl6" 
event={"ID":"a62d0f30-735b-410e-ac80-50a98636ff47","Type":"ContainerDied","Data":"99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e"} Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.199881 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.660401 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.785924 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4gz8\" (UniqueName: \"kubernetes.io/projected/a62d0f30-735b-410e-ac80-50a98636ff47-kube-api-access-z4gz8\") pod \"a62d0f30-735b-410e-ac80-50a98636ff47\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.785973 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-combined-ca-bundle\") pod \"a62d0f30-735b-410e-ac80-50a98636ff47\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.786127 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-db-sync-config-data\") pod \"a62d0f30-735b-410e-ac80-50a98636ff47\" (UID: \"a62d0f30-735b-410e-ac80-50a98636ff47\") " Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.795168 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a62d0f30-735b-410e-ac80-50a98636ff47" (UID: "a62d0f30-735b-410e-ac80-50a98636ff47"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.795346 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62d0f30-735b-410e-ac80-50a98636ff47-kube-api-access-z4gz8" (OuterVolumeSpecName: "kube-api-access-z4gz8") pod "a62d0f30-735b-410e-ac80-50a98636ff47" (UID: "a62d0f30-735b-410e-ac80-50a98636ff47"). InnerVolumeSpecName "kube-api-access-z4gz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.820346 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a62d0f30-735b-410e-ac80-50a98636ff47" (UID: "a62d0f30-735b-410e-ac80-50a98636ff47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.889172 4935 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.889225 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4gz8\" (UniqueName: \"kubernetes.io/projected/a62d0f30-735b-410e-ac80-50a98636ff47-kube-api-access-z4gz8\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:58 crc kubenswrapper[4935]: I1217 09:24:58.889239 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a62d0f30-735b-410e-ac80-50a98636ff47-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.063348 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" 
containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.211910 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-vhjl6" event={"ID":"a62d0f30-735b-410e-ac80-50a98636ff47","Type":"ContainerDied","Data":"146a5d0345e2e6feddae3f48799ff20c4811440e1a611f4126bce480d2675319"} Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.211960 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="146a5d0345e2e6feddae3f48799ff20c4811440e1a611f4126bce480d2675319" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.212028 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-vhjl6" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.231174 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bbfb6547d-64jt7" podUID="3658abd7-bc1e-4359-aa8b-011fe7189342" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.331392 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.427479 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.464953 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69fff79bd9-55nbl"] Dec 17 09:24:59 crc kubenswrapper[4935]: E1217 09:24:59.476445 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" 
containerName="dnsmasq-dns" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.476490 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerName="dnsmasq-dns" Dec 17 09:24:59 crc kubenswrapper[4935]: E1217 09:24:59.476518 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerName="init" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.476526 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerName="init" Dec 17 09:24:59 crc kubenswrapper[4935]: E1217 09:24:59.476547 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62d0f30-735b-410e-ac80-50a98636ff47" containerName="barbican-db-sync" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.476555 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62d0f30-735b-410e-ac80-50a98636ff47" containerName="barbican-db-sync" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.476907 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62d0f30-735b-410e-ac80-50a98636ff47" containerName="barbican-db-sync" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.476946 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe048f60-c05c-4561-88bc-42d1b9eecd6c" containerName="dnsmasq-dns" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.478241 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.482029 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.482614 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zw5sc" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.487755 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.525017 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69fff79bd9-55nbl"] Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.570352 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs"] Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.572295 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.578603 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.597980 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs"] Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.608052 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-config-data-custom\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.608422 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-logs\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.608521 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-combined-ca-bundle\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.608609 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8kwj\" (UniqueName: \"kubernetes.io/projected/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-kube-api-access-c8kwj\") pod 
\"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.608742 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-config-data\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.617350 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-2vpt9"] Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.619245 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.661515 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-2vpt9"] Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.710770 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-combined-ca-bundle\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.710843 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.710876 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-config-data\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.710909 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-config-data\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.710945 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-config-data-custom\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.710973 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711005 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdrv2\" (UniqueName: \"kubernetes.io/projected/4798a48d-9fdb-40d8-9890-595874a05215-kube-api-access-tdrv2\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " 
pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711039 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711058 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-config-data-custom\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711092 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-config\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711133 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-logs\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711151 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") 
" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711174 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-combined-ca-bundle\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711193 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8kwj\" (UniqueName: \"kubernetes.io/projected/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-kube-api-access-c8kwj\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711209 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pspqc\" (UniqueName: \"kubernetes.io/projected/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-kube-api-access-pspqc\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.711227 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4798a48d-9fdb-40d8-9890-595874a05215-logs\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.716982 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-logs\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: 
\"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.732926 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-config-data\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.739615 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-config-data-custom\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.741656 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-combined-ca-bundle\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.765020 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8kwj\" (UniqueName: \"kubernetes.io/projected/93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863-kube-api-access-c8kwj\") pod \"barbican-worker-69fff79bd9-55nbl\" (UID: \"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863\") " pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.798797 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8fc8c65dd-5rdnt"] Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.799821 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-69fff79bd9-55nbl" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.800782 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.805211 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813404 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813586 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pspqc\" (UniqueName: \"kubernetes.io/projected/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-kube-api-access-pspqc\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813622 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4798a48d-9fdb-40d8-9890-595874a05215-logs\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813704 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-combined-ca-bundle\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " 
pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813756 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813793 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-config-data\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813848 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-config-data-custom\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813876 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.813931 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdrv2\" (UniqueName: \"kubernetes.io/projected/4798a48d-9fdb-40d8-9890-595874a05215-kube-api-access-tdrv2\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: 
\"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.814010 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.814050 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-config\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.815560 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-swift-storage-0\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.815600 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-svc\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.815836 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-config\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 
09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.817138 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-nb\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.826495 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8fc8c65dd-5rdnt"] Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.833790 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4798a48d-9fdb-40d8-9890-595874a05215-logs\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.837890 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-config-data-custom\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.843047 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-combined-ca-bundle\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.848580 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-sb\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.850355 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4798a48d-9fdb-40d8-9890-595874a05215-config-data\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.862363 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdrv2\" (UniqueName: \"kubernetes.io/projected/4798a48d-9fdb-40d8-9890-595874a05215-kube-api-access-tdrv2\") pod \"barbican-keystone-listener-5f4bb4f7d-ntfxs\" (UID: \"4798a48d-9fdb-40d8-9890-595874a05215\") " pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.870356 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pspqc\" (UniqueName: \"kubernetes.io/projected/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-kube-api-access-pspqc\") pod \"dnsmasq-dns-7bdf86f46f-2vpt9\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.906663 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.916517 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data-custom\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.916573 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-combined-ca-bundle\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.916619 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkqsp\" (UniqueName: \"kubernetes.io/projected/671ea9f9-7bde-4f11-8793-67f21663f8ec-kube-api-access-dkqsp\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.916641 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/671ea9f9-7bde-4f11-8793-67f21663f8ec-logs\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.916673 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data\") pod 
\"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:24:59 crc kubenswrapper[4935]: I1217 09:24:59.948715 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.018769 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data-custom\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.018842 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-combined-ca-bundle\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.018945 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkqsp\" (UniqueName: \"kubernetes.io/projected/671ea9f9-7bde-4f11-8793-67f21663f8ec-kube-api-access-dkqsp\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.018971 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/671ea9f9-7bde-4f11-8793-67f21663f8ec-logs\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.019004 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.027384 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data-custom\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.028676 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.029356 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/671ea9f9-7bde-4f11-8793-67f21663f8ec-logs\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.031894 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-combined-ca-bundle\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.050459 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkqsp\" (UniqueName: 
\"kubernetes.io/projected/671ea9f9-7bde-4f11-8793-67f21663f8ec-kube-api-access-dkqsp\") pod \"barbican-api-8fc8c65dd-5rdnt\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.292076 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:00 crc kubenswrapper[4935]: I1217 09:25:00.416597 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69fff79bd9-55nbl"] Dec 17 09:25:01 crc kubenswrapper[4935]: I1217 09:25:01.288085 4935 generic.go:334] "Generic (PLEG): container finished" podID="9b17c8be-6039-4aa6-8227-cd2dfc076f77" containerID="9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce" exitCode=0 Dec 17 09:25:01 crc kubenswrapper[4935]: I1217 09:25:01.288588 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7nmq" event={"ID":"9b17c8be-6039-4aa6-8227-cd2dfc076f77","Type":"ContainerDied","Data":"9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce"} Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.052971 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b5f4b44fb-ktsl4"] Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.055164 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.057111 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.058641 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.074567 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b5f4b44fb-ktsl4"] Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.204642 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-internal-tls-certs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.205184 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-public-tls-certs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.205210 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-config-data-custom\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.205241 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-config-data\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.205508 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ef2a77-ca25-495d-a00c-f15993955019-logs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.205782 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-combined-ca-bundle\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.205859 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrrj2\" (UniqueName: \"kubernetes.io/projected/e4ef2a77-ca25-495d-a00c-f15993955019-kube-api-access-lrrj2\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.308370 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrrj2\" (UniqueName: \"kubernetes.io/projected/e4ef2a77-ca25-495d-a00c-f15993955019-kube-api-access-lrrj2\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.308437 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-internal-tls-certs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.308470 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-public-tls-certs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.308502 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-config-data-custom\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.308530 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-config-data\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.308896 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ef2a77-ca25-495d-a00c-f15993955019-logs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.308916 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-combined-ca-bundle\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.310582 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ef2a77-ca25-495d-a00c-f15993955019-logs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.317219 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-public-tls-certs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.318541 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-config-data\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.319063 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-internal-tls-certs\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.322167 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-config-data-custom\") pod 
\"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.324198 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ef2a77-ca25-495d-a00c-f15993955019-combined-ca-bundle\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.343039 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrrj2\" (UniqueName: \"kubernetes.io/projected/e4ef2a77-ca25-495d-a00c-f15993955019-kube-api-access-lrrj2\") pod \"barbican-api-5b5f4b44fb-ktsl4\" (UID: \"e4ef2a77-ca25-495d-a00c-f15993955019\") " pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:03 crc kubenswrapper[4935]: I1217 09:25:03.380949 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:05 crc kubenswrapper[4935]: E1217 09:25:05.167777 4935 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/4127a4dbcf84993dda630ee36221ebdee8987891886ee15e9a5591e6eedeb62a/diff" to get inode usage: stat /var/lib/containers/storage/overlay/4127a4dbcf84993dda630ee36221ebdee8987891886ee15e9a5591e6eedeb62a/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_dnsmasq-dns-5dc4fcdbc-nl2wv_fe048f60-c05c-4561-88bc-42d1b9eecd6c/dnsmasq-dns/0.log" to get inode usage: stat /var/log/pods/openstack_dnsmasq-dns-5dc4fcdbc-nl2wv_fe048f60-c05c-4561-88bc-42d1b9eecd6c/dnsmasq-dns/0.log: no such file or directory Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.239667 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.371450 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-f7nmq" event={"ID":"9b17c8be-6039-4aa6-8227-cd2dfc076f77","Type":"ContainerDied","Data":"9b25756dd849c95ef48605f4c64fc6e91c80c6ece3efdf637ce80865932b1bb6"} Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.371584 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-f7nmq" Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.371680 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b25756dd849c95ef48605f4c64fc6e91c80c6ece3efdf637ce80865932b1bb6" Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.372991 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fff79bd9-55nbl" event={"ID":"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863","Type":"ContainerStarted","Data":"04e76b344961e21e0eb99f1e0418e64d2eb6f74b6e8763c8021cb0a13e5b1bae"} Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.396155 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-db-sync-config-data\") pod \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.396213 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-config-data\") pod \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") " Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.396408 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/9b17c8be-6039-4aa6-8227-cd2dfc076f77-etc-machine-id\") pod \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") "
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.396508 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9b17c8be-6039-4aa6-8227-cd2dfc076f77-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9b17c8be-6039-4aa6-8227-cd2dfc076f77" (UID: "9b17c8be-6039-4aa6-8227-cd2dfc076f77"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.396607 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-scripts\") pod \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") "
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.397321 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-combined-ca-bundle\") pod \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") "
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.397405 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9sct\" (UniqueName: \"kubernetes.io/projected/9b17c8be-6039-4aa6-8227-cd2dfc076f77-kube-api-access-x9sct\") pod \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\" (UID: \"9b17c8be-6039-4aa6-8227-cd2dfc076f77\") "
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.397995 4935 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9b17c8be-6039-4aa6-8227-cd2dfc076f77-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.410480 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-scripts" (OuterVolumeSpecName: "scripts") pod "9b17c8be-6039-4aa6-8227-cd2dfc076f77" (UID: "9b17c8be-6039-4aa6-8227-cd2dfc076f77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.410549 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9b17c8be-6039-4aa6-8227-cd2dfc076f77" (UID: "9b17c8be-6039-4aa6-8227-cd2dfc076f77"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.410703 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b17c8be-6039-4aa6-8227-cd2dfc076f77-kube-api-access-x9sct" (OuterVolumeSpecName: "kube-api-access-x9sct") pod "9b17c8be-6039-4aa6-8227-cd2dfc076f77" (UID: "9b17c8be-6039-4aa6-8227-cd2dfc076f77"). InnerVolumeSpecName "kube-api-access-x9sct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.443350 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b17c8be-6039-4aa6-8227-cd2dfc076f77" (UID: "9b17c8be-6039-4aa6-8227-cd2dfc076f77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.457405 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-config-data" (OuterVolumeSpecName: "config-data") pod "9b17c8be-6039-4aa6-8227-cd2dfc076f77" (UID: "9b17c8be-6039-4aa6-8227-cd2dfc076f77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.500346 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-scripts\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.500416 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.500432 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9sct\" (UniqueName: \"kubernetes.io/projected/9b17c8be-6039-4aa6-8227-cd2dfc076f77-kube-api-access-x9sct\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.500444 4935 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:06 crc kubenswrapper[4935]: I1217 09:25:06.500454 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b17c8be-6039-4aa6-8227-cd2dfc076f77-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.403832 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-84f77988b8-57qwv"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.652639 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 17 09:25:07 crc kubenswrapper[4935]: E1217 09:25:07.653598 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b17c8be-6039-4aa6-8227-cd2dfc076f77" containerName="cinder-db-sync"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.653627 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b17c8be-6039-4aa6-8227-cd2dfc076f77" containerName="cinder-db-sync"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.654103 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b17c8be-6039-4aa6-8227-cd2dfc076f77" containerName="cinder-db-sync"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.655444 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.667394 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.668144 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-m6hmm"
Dec 17 09:25:07 crc kubenswrapper[4935]: W1217 09:25:07.668292 4935 reflector.go:561] object-"openstack"/"cinder-scheduler-config-data": failed to list *v1.Secret: secrets "cinder-scheduler-config-data" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack": no relationship found between node 'crc' and this object
Dec 17 09:25:07 crc kubenswrapper[4935]: E1217 09:25:07.668328 4935 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"cinder-scheduler-config-data\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"cinder-scheduler-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.668454 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.686138 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.737417 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.738603 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.738829 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wng86\" (UniqueName: \"kubernetes.io/projected/d273eade-a4a7-4f15-92e0-058d24689c84-kube-api-access-wng86\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.746676 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-scripts\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.746818 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.746858 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d273eade-a4a7-4f15-92e0-058d24689c84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.851786 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wng86\" (UniqueName: \"kubernetes.io/projected/d273eade-a4a7-4f15-92e0-058d24689c84-kube-api-access-wng86\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.851866 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-scripts\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.851917 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.851953 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d273eade-a4a7-4f15-92e0-058d24689c84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.851981 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.852017 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.853419 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d273eade-a4a7-4f15-92e0-058d24689c84-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.894689 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.894960 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.897019 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-scripts\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.905103 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wng86\" (UniqueName: \"kubernetes.io/projected/d273eade-a4a7-4f15-92e0-058d24689c84-kube-api-access-wng86\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:07 crc kubenswrapper[4935]: I1217 09:25:07.917120 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-2vpt9"]
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.030666 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-cbvf9"]
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.053761 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.101193 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-cbvf9"]
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.197919 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.198469 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.198525 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-config\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.198611 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncnr5\" (UniqueName: \"kubernetes.io/projected/7ae87ac8-5831-4714-bd2a-806b03f485aa-kube-api-access-ncnr5\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.198678 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.198732 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.228116 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.244298 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.248256 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.253808 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.300358 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.300449 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.300508 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.300536 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.300605 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-config\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.300682 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncnr5\" (UniqueName: \"kubernetes.io/projected/7ae87ac8-5831-4714-bd2a-806b03f485aa-kube-api-access-ncnr5\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.301798 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.301860 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.304821 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.304846 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.304971 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-config\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.326621 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncnr5\" (UniqueName: \"kubernetes.io/projected/7ae87ac8-5831-4714-bd2a-806b03f485aa-kube-api-access-ncnr5\") pod \"dnsmasq-dns-75bfc9b94f-cbvf9\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: W1217 09:25:08.389584 4935 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62d0f30_735b_410e_ac80_50a98636ff47.slice/crio-conmon-99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62d0f30_735b_410e_ac80_50a98636ff47.slice/crio-conmon-99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e.scope: no such file or directory
Dec 17 09:25:08 crc kubenswrapper[4935]: W1217 09:25:08.389791 4935 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62d0f30_735b_410e_ac80_50a98636ff47.slice/crio-99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62d0f30_735b_410e_ac80_50a98636ff47.slice/crio-99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e.scope: no such file or directory
Dec 17 09:25:08 crc kubenswrapper[4935]: W1217 09:25:08.390092 4935 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b17c8be_6039_4aa6_8227_cd2dfc076f77.slice/crio-conmon-9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b17c8be_6039_4aa6_8227_cd2dfc076f77.slice/crio-conmon-9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce.scope: no such file or directory
Dec 17 09:25:08 crc kubenswrapper[4935]: W1217 09:25:08.390122 4935 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b17c8be_6039_4aa6_8227_cd2dfc076f77.slice/crio-9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b17c8be_6039_4aa6_8227_cd2dfc076f77.slice/crio-9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce.scope: no such file or directory
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.403985 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.404036 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfdadde0-a814-47bc-a025-db93796c38c4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.404079 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.404197 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.404222 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzj9x\" (UniqueName: \"kubernetes.io/projected/bfdadde0-a814-47bc-a025-db93796c38c4-kube-api-access-xzj9x\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.404240 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-scripts\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.404387 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdadde0-a814-47bc-a025-db93796c38c4-logs\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.466618 4935 generic.go:334] "Generic (PLEG): container finished" podID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerID="14ae31a0284d360b60ba5de97c50ba8651c3950f140b682879e1e00c6dbfa715" exitCode=137
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.466717 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687fb4989-g2n7c" event={"ID":"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7","Type":"ContainerDied","Data":"14ae31a0284d360b60ba5de97c50ba8651c3950f140b682879e1e00c6dbfa715"}
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.478510 4935 generic.go:334] "Generic (PLEG): container finished" podID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerID="69bda1557a71e6e1a0ed1d1935c9fc2f652c5445904e049de5ba53ddbc652d93" exitCode=137
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.478550 4935 generic.go:334] "Generic (PLEG): container finished" podID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerID="9eaac25bcefbb36ab7789b5708c2696617578220b4e144bda5955dac76763fed" exitCode=137
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.478585 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f667bdd5-px4gj" event={"ID":"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc","Type":"ContainerDied","Data":"69bda1557a71e6e1a0ed1d1935c9fc2f652c5445904e049de5ba53ddbc652d93"}
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.478619 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f667bdd5-px4gj" event={"ID":"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc","Type":"ContainerDied","Data":"9eaac25bcefbb36ab7789b5708c2696617578220b4e144bda5955dac76763fed"}
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.507558 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzj9x\" (UniqueName: \"kubernetes.io/projected/bfdadde0-a814-47bc-a025-db93796c38c4-kube-api-access-xzj9x\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.507609 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-scripts\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.507649 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdadde0-a814-47bc-a025-db93796c38c4-logs\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.507690 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.507706 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfdadde0-a814-47bc-a025-db93796c38c4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.507744 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.507841 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.508067 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfdadde0-a814-47bc-a025-db93796c38c4-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.511686 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdadde0-a814-47bc-a025-db93796c38c4-logs\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.515466 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.516362 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data-custom\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.517145 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.521621 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-scripts\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.535609 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzj9x\" (UniqueName: \"kubernetes.io/projected/bfdadde0-a814-47bc-a025-db93796c38c4-kube-api-access-xzj9x\") pod \"cinder-api-0\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: E1217 09:25:08.613072 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cca130e_5dbc_4edb_b0d9_f04a0ecd2ea6.slice/crio-9a929878615707b4cd4504b26cee0370dab037a81d470423516668afc2cc6c70\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe048f60_c05c_4561_88bc_42d1b9eecd6c.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b52e650_9c70_4617_9fbb_12fbb5a1c3e0.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62d0f30_735b_410e_ac80_50a98636ff47.slice/crio-146a5d0345e2e6feddae3f48799ff20c4811440e1a611f4126bce480d2675319\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b52e650_9c70_4617_9fbb_12fbb5a1c3e0.slice/crio-3ea148c6b6a672fe1cdb81351986afd65cd19564e042063d59a6a01552c2d554.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b865a47_3ef5_4de0_88f8_4eeaba5de2cc.slice/crio-69bda1557a71e6e1a0ed1d1935c9fc2f652c5445904e049de5ba53ddbc652d93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b52e650_9c70_4617_9fbb_12fbb5a1c3e0.slice/crio-conmon-3ea148c6b6a672fe1cdb81351986afd65cd19564e042063d59a6a01552c2d554.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda62d0f30_735b_410e_ac80_50a98636ff47.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b865a47_3ef5_4de0_88f8_4eeaba5de2cc.slice/crio-conmon-69bda1557a71e6e1a0ed1d1935c9fc2f652c5445904e049de5ba53ddbc652d93.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb867f99b_0bea_4d24_88e7_4dc1c1f991e6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cca130e_5dbc_4edb_b0d9_f04a0ecd2ea6.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded5ba8df_52ae_47ef_ad50_8ccb05fa65d7.slice/crio-14ae31a0284d360b60ba5de97c50ba8651c3950f140b682879e1e00c6dbfa715.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb867f99b_0bea_4d24_88e7_4dc1c1f991e6.slice/crio-74b18765fd2e15e774860dba087b7765cd444eb7513801fbedd0868947d51bbb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded5ba8df_52ae_47ef_ad50_8ccb05fa65d7.slice/crio-conmon-14ae31a0284d360b60ba5de97c50ba8651c3950f140b682879e1e00c6dbfa715.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb867f99b_0bea_4d24_88e7_4dc1c1f991e6.slice/crio-conmon-42c8ce9d58026ac8daf60247c2367152c2c018b6ea58efe28a489e158d919d97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b17c8be_6039_4aa6_8227_cd2dfc076f77.slice/crio-9b25756dd849c95ef48605f4c64fc6e91c80c6ece3efdf637ce80865932b1bb6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb867f99b_0bea_4d24_88e7_4dc1c1f991e6.slice/crio-42c8ce9d58026ac8daf60247c2367152c2c018b6ea58efe28a489e158d919d97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe048f60_c05c_4561_88bc_42d1b9eecd6c.slice/crio-c092f40bfeea17a4782fdd221416617988c696650416e92c6f8d0324413851f8\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b865a47_3ef5_4de0_88f8_4eeaba5de2cc.slice/crio-conmon-9eaac25bcefbb36ab7789b5708c2696617578220b4e144bda5955dac76763fed.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b865a47_3ef5_4de0_88f8_4eeaba5de2cc.slice/crio-9eaac25bcefbb36ab7789b5708c2696617578220b4e144bda5955dac76763fed.scope\": RecentStats: unable to find data in memory cache]"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.655519 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9"
Dec 17 09:25:08 crc kubenswrapper[4935]: I1217 09:25:08.665658 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 17 09:25:08 crc kubenswrapper[4935]: E1217 09:25:08.852354 4935 secret.go:188] Couldn't get secret openstack/cinder-scheduler-config-data: failed to sync secret cache: timed out waiting for the condition
Dec 17 09:25:08 crc kubenswrapper[4935]: E1217 09:25:08.852468 4935 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom podName:d273eade-a4a7-4f15-92e0-058d24689c84 nodeName:}" failed. No retries permitted until 2025-12-17 09:25:09.352448029 +0000 UTC m=+1229.012288792 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom") pod "cinder-scheduler-0" (UID: "d273eade-a4a7-4f15-92e0-058d24689c84") : failed to sync secret cache: timed out waiting for the condition Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.174291 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 17 09:25:09 crc kubenswrapper[4935]: E1217 09:25:09.197801 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/ubi9/httpd-24@sha256:472914fd4d171b72c8b0c80dd34245cd6fb519807d08b89164c265d0b35c0052" Dec 17 09:25:09 crc kubenswrapper[4935]: E1217 09:25:09.198459 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:proxy-httpd,Image:registry.redhat.io/ubi9/httpd-24@sha256:472914fd4d171b72c8b0c80dd34245cd6fb519807d08b89164c265d0b35c0052,Command:[/usr/sbin/httpd],Args:[-DFOREGROUND],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:proxy-httpd,HostPort:0,ContainerPort:3000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf/httpd.conf,SubPath:httpd.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/httpd/conf.d/ssl.conf,SubPath:ssl.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run-httpd,ReadOnly:false,MountPath:/run/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log-httpd,ReadOnly:false,MountPath:/var/log/httpd,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bu
ndle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-plc9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/,Port:{0 3000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:30,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ee747235-93a8-420f-95b6-232cdf8a3223): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 17 09:25:09 crc kubenswrapper[4935]: E1217 09:25:09.199804 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"proxy-httpd\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
system image from manifest list: copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.428151 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0" Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.437998 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " pod="openstack/cinder-scheduler-0" Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.496331 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.564892 4935 generic.go:334] "Generic (PLEG): container finished" podID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerID="32750d33a6bf3e214de395af044fa80bde112e7d6d36df1f148c3380da508b17" exitCode=137 Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.565090 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-central-agent" containerID="cri-o://33e694debc362dac64987513367abf8519814666ab8bf4d62e9ac950f0001602" gracePeriod=30 Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.565311 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687fb4989-g2n7c" event={"ID":"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7","Type":"ContainerDied","Data":"32750d33a6bf3e214de395af044fa80bde112e7d6d36df1f148c3380da508b17"} Dec 17 09:25:09 crc 
kubenswrapper[4935]: I1217 09:25:09.565526 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-notification-agent" containerID="cri-o://0ba2fc44e91ee7ba44310f25eaf57f763e280313b2746fec7830e9ec68e45894" gracePeriod=30 Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.565644 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="sg-core" containerID="cri-o://b59fef636375afb6a8b1c06ccfd489e69629eb432f6376cb2e3c477ba0298ae5" gracePeriod=30 Dec 17 09:25:09 crc kubenswrapper[4935]: I1217 09:25:09.979202 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.081852 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-config-data\") pod \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.081978 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-scripts\") pod \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.082036 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-logs\") pod \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.082157 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cpptj\" (UniqueName: \"kubernetes.io/projected/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-kube-api-access-cpptj\") pod \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.082249 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-horizon-secret-key\") pod \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\" (UID: \"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.083669 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-logs" (OuterVolumeSpecName: "logs") pod "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" (UID: "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.105454 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" (UID: "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.110150 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-kube-api-access-cpptj" (OuterVolumeSpecName: "kube-api-access-cpptj") pod "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" (UID: "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc"). InnerVolumeSpecName "kube-api-access-cpptj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.130134 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-config-data" (OuterVolumeSpecName: "config-data") pod "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" (UID: "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.147170 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-scripts" (OuterVolumeSpecName: "scripts") pod "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" (UID: "8b865a47-3ef5-4de0-88f8-4eeaba5de2cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.158500 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.184334 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.184371 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpptj\" (UniqueName: \"kubernetes.io/projected/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-kube-api-access-cpptj\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.184385 4935 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.184397 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.184409 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.286106 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-horizon-secret-key\") pod \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.286637 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-config-data\") pod 
\"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.286690 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-scripts\") pod \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.286768 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-logs\") pod \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.286802 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhb7x\" (UniqueName: \"kubernetes.io/projected/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-kube-api-access-hhb7x\") pod \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\" (UID: \"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7\") " Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.292766 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-logs" (OuterVolumeSpecName: "logs") pod "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" (UID: "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.327825 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" (UID: "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.331017 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-kube-api-access-hhb7x" (OuterVolumeSpecName: "kube-api-access-hhb7x") pod "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" (UID: "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7"). InnerVolumeSpecName "kube-api-access-hhb7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.380664 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-config-data" (OuterVolumeSpecName: "config-data") pod "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" (UID: "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.398129 4935 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.398165 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.398175 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.398187 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhb7x\" (UniqueName: \"kubernetes.io/projected/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-kube-api-access-hhb7x\") on node \"crc\" 
DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.427590 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-scripts" (OuterVolumeSpecName: "scripts") pod "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" (UID: "ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.500979 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.576088 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-64f667bdd5-px4gj" event={"ID":"8b865a47-3ef5-4de0-88f8-4eeaba5de2cc","Type":"ContainerDied","Data":"10a0127f3352bde2295d276bf724d2c1d2fe80a9913c80791827a8053637abc7"} Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.576157 4935 scope.go:117] "RemoveContainer" containerID="69bda1557a71e6e1a0ed1d1935c9fc2f652c5445904e049de5ba53ddbc652d93" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.576330 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-64f667bdd5-px4gj" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.596014 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-687fb4989-g2n7c" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.596021 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-687fb4989-g2n7c" event={"ID":"ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7","Type":"ContainerDied","Data":"a55f130981aac9a3c39485e3603f9df47ca81da4f60e2f47317b58b6103230b4"} Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.601487 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee747235-93a8-420f-95b6-232cdf8a3223" containerID="b59fef636375afb6a8b1c06ccfd489e69629eb432f6376cb2e3c477ba0298ae5" exitCode=2 Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.601633 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee747235-93a8-420f-95b6-232cdf8a3223" containerID="33e694debc362dac64987513367abf8519814666ab8bf4d62e9ac950f0001602" exitCode=0 Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.601661 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerDied","Data":"b59fef636375afb6a8b1c06ccfd489e69629eb432f6376cb2e3c477ba0298ae5"} Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.601698 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerDied","Data":"33e694debc362dac64987513367abf8519814666ab8bf4d62e9ac950f0001602"} Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.672437 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-64f667bdd5-px4gj"] Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.732828 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-64f667bdd5-px4gj"] Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.741639 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-687fb4989-g2n7c"] Dec 17 09:25:10 crc kubenswrapper[4935]: 
I1217 09:25:10.751426 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-687fb4989-g2n7c"] Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.790944 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7c8457c6df-5qkkl" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.880494 4935 scope.go:117] "RemoveContainer" containerID="9eaac25bcefbb36ab7789b5708c2696617578220b4e144bda5955dac76763fed" Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.900880 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84f77988b8-57qwv"] Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.901209 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84f77988b8-57qwv" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-api" containerID="cri-o://3530ad080d1edcab525f47d2d9a66c1e08f558003b8a420933927b17595a5589" gracePeriod=30 Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.901930 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-84f77988b8-57qwv" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-httpd" containerID="cri-o://780433440bfe8714d3a013717e8a34ddf923ead49f1825269886dcb6621a5661" gracePeriod=30 Dec 17 09:25:10 crc kubenswrapper[4935]: I1217 09:25:10.964528 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b5f4b44fb-ktsl4"] Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.014366 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs"] Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.025331 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-cbvf9"] Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.090340 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-api-8fc8c65dd-5rdnt"] Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.184895 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" path="/var/lib/kubelet/pods/8b865a47-3ef5-4de0-88f8-4eeaba5de2cc/volumes" Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.186176 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" path="/var/lib/kubelet/pods/ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7/volumes" Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.195523 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.393378 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.409370 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-2vpt9"] Dec 17 09:25:11 crc kubenswrapper[4935]: I1217 09:25:11.496599 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 17 09:25:12 crc kubenswrapper[4935]: W1217 09:25:12.349512 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod671ea9f9_7bde_4f11_8793_67f21663f8ec.slice/crio-ff66392d9438c5568e9325aaef17830f6f3789b0feb10955ce3fc4ab04e2de88 WatchSource:0}: Error finding container ff66392d9438c5568e9325aaef17830f6f3789b0feb10955ce3fc4ab04e2de88: Status 404 returned error can't find the container with id ff66392d9438c5568e9325aaef17830f6f3789b0feb10955ce3fc4ab04e2de88 Dec 17 09:25:12 crc kubenswrapper[4935]: W1217 09:25:12.376800 4935 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd273eade_a4a7_4f15_92e0_058d24689c84.slice/crio-fe1af8815ce704333068574d6d6dbe8081db9c43846a76a1c757fc504d6c4821 WatchSource:0}: Error finding container fe1af8815ce704333068574d6d6dbe8081db9c43846a76a1c757fc504d6c4821: Status 404 returned error can't find the container with id fe1af8815ce704333068574d6d6dbe8081db9c43846a76a1c757fc504d6c4821 Dec 17 09:25:12 crc kubenswrapper[4935]: W1217 09:25:12.386584 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4798a48d_9fdb_40d8_9890_595874a05215.slice/crio-a9375f7c1c023a4aa4fa8935a45334e676b3eff863956ae336f2e4b5c3c44d67 WatchSource:0}: Error finding container a9375f7c1c023a4aa4fa8935a45334e676b3eff863956ae336f2e4b5c3c44d67: Status 404 returned error can't find the container with id a9375f7c1c023a4aa4fa8935a45334e676b3eff863956ae336f2e4b5c3c44d67 Dec 17 09:25:12 crc kubenswrapper[4935]: W1217 09:25:12.396420 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4ef2a77_ca25_495d_a00c_f15993955019.slice/crio-25441c7d9a3b48ceeabd4ff6b0e562ffec0f90ed3aa6b6dc8ef61f12c5820b26 WatchSource:0}: Error finding container 25441c7d9a3b48ceeabd4ff6b0e562ffec0f90ed3aa6b6dc8ef61f12c5820b26: Status 404 returned error can't find the container with id 25441c7d9a3b48ceeabd4ff6b0e562ffec0f90ed3aa6b6dc8ef61f12c5820b26 Dec 17 09:25:12 crc kubenswrapper[4935]: W1217 09:25:12.403486 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ae87ac8_5831_4714_bd2a_806b03f485aa.slice/crio-50a805a55c2291130e5ebd5f37cd122970e0332f1fb47459be872f8a8e84de64 WatchSource:0}: Error finding container 50a805a55c2291130e5ebd5f37cd122970e0332f1fb47459be872f8a8e84de64: Status 404 returned error can't find the container with id 
50a805a55c2291130e5ebd5f37cd122970e0332f1fb47459be872f8a8e84de64 Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.430635 4935 scope.go:117] "RemoveContainer" containerID="32750d33a6bf3e214de395af044fa80bde112e7d6d36df1f148c3380da508b17" Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.682632 4935 generic.go:334] "Generic (PLEG): container finished" podID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerID="780433440bfe8714d3a013717e8a34ddf923ead49f1825269886dcb6621a5661" exitCode=0 Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.683361 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f77988b8-57qwv" event={"ID":"6796ebed-95e9-4af3-a76f-0acc484ddbfb","Type":"ContainerDied","Data":"780433440bfe8714d3a013717e8a34ddf923ead49f1825269886dcb6621a5661"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.688994 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfdadde0-a814-47bc-a025-db93796c38c4","Type":"ContainerStarted","Data":"625f7f611b1f29028338da8f9e96c8badc029378089f04cc6ccc27235f3f5919"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.702186 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" event={"ID":"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4","Type":"ContainerStarted","Data":"97c36558e9fe80dd708203be19e1ad451de7f7ce5d01484996a7c43b3e4b87be"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.721899 4935 scope.go:117] "RemoveContainer" containerID="14ae31a0284d360b60ba5de97c50ba8651c3950f140b682879e1e00c6dbfa715" Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.735634 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee747235-93a8-420f-95b6-232cdf8a3223" containerID="0ba2fc44e91ee7ba44310f25eaf57f763e280313b2746fec7830e9ec68e45894" exitCode=0 Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.735773 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerDied","Data":"0ba2fc44e91ee7ba44310f25eaf57f763e280313b2746fec7830e9ec68e45894"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.759706 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" event={"ID":"7ae87ac8-5831-4714-bd2a-806b03f485aa","Type":"ContainerStarted","Data":"50a805a55c2291130e5ebd5f37cd122970e0332f1fb47459be872f8a8e84de64"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.763166 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" event={"ID":"e4ef2a77-ca25-495d-a00c-f15993955019","Type":"ContainerStarted","Data":"25441c7d9a3b48ceeabd4ff6b0e562ffec0f90ed3aa6b6dc8ef61f12c5820b26"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.764919 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8fc8c65dd-5rdnt" event={"ID":"671ea9f9-7bde-4f11-8793-67f21663f8ec","Type":"ContainerStarted","Data":"ff66392d9438c5568e9325aaef17830f6f3789b0feb10955ce3fc4ab04e2de88"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.766250 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" event={"ID":"4798a48d-9fdb-40d8-9890-595874a05215","Type":"ContainerStarted","Data":"a9375f7c1c023a4aa4fa8935a45334e676b3eff863956ae336f2e4b5c3c44d67"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.767404 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d273eade-a4a7-4f15-92e0-058d24689c84","Type":"ContainerStarted","Data":"fe1af8815ce704333068574d6d6dbe8081db9c43846a76a1c757fc504d6c4821"} Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.817321 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.947397 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-run-httpd\") pod \"ee747235-93a8-420f-95b6-232cdf8a3223\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.947588 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-scripts\") pod \"ee747235-93a8-420f-95b6-232cdf8a3223\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.947974 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee747235-93a8-420f-95b6-232cdf8a3223" (UID: "ee747235-93a8-420f-95b6-232cdf8a3223"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.949721 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plc9n\" (UniqueName: \"kubernetes.io/projected/ee747235-93a8-420f-95b6-232cdf8a3223-kube-api-access-plc9n\") pod \"ee747235-93a8-420f-95b6-232cdf8a3223\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.949800 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-log-httpd\") pod \"ee747235-93a8-420f-95b6-232cdf8a3223\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.949824 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-config-data\") pod \"ee747235-93a8-420f-95b6-232cdf8a3223\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.950173 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee747235-93a8-420f-95b6-232cdf8a3223" (UID: "ee747235-93a8-420f-95b6-232cdf8a3223"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.950249 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-combined-ca-bundle\") pod \"ee747235-93a8-420f-95b6-232cdf8a3223\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.950315 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-sg-core-conf-yaml\") pod \"ee747235-93a8-420f-95b6-232cdf8a3223\" (UID: \"ee747235-93a8-420f-95b6-232cdf8a3223\") " Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.962586 4935 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:12 crc kubenswrapper[4935]: I1217 09:25:12.962678 4935 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee747235-93a8-420f-95b6-232cdf8a3223-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.002983 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-scripts" (OuterVolumeSpecName: "scripts") pod "ee747235-93a8-420f-95b6-232cdf8a3223" (UID: "ee747235-93a8-420f-95b6-232cdf8a3223"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.046530 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee747235-93a8-420f-95b6-232cdf8a3223-kube-api-access-plc9n" (OuterVolumeSpecName: "kube-api-access-plc9n") pod "ee747235-93a8-420f-95b6-232cdf8a3223" (UID: "ee747235-93a8-420f-95b6-232cdf8a3223"). InnerVolumeSpecName "kube-api-access-plc9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.066905 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.066967 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plc9n\" (UniqueName: \"kubernetes.io/projected/ee747235-93a8-420f-95b6-232cdf8a3223-kube-api-access-plc9n\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.281878 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee747235-93a8-420f-95b6-232cdf8a3223" (UID: "ee747235-93a8-420f-95b6-232cdf8a3223"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.302489 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54c44548bb-26j2l" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.326121 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee747235-93a8-420f-95b6-232cdf8a3223" (UID: "ee747235-93a8-420f-95b6-232cdf8a3223"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.363468 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-config-data" (OuterVolumeSpecName: "config-data") pod "ee747235-93a8-420f-95b6-232cdf8a3223" (UID: "ee747235-93a8-420f-95b6-232cdf8a3223"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.377987 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.378672 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.378703 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.378720 4935 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee747235-93a8-420f-95b6-232cdf8a3223-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.785949 4935 generic.go:334] "Generic (PLEG): container finished" podID="6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" containerID="03b2fbb557ad5068b1b9d736c30c813732acb8a964a30966917d75f68ff13c6e" exitCode=0 Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.786021 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" event={"ID":"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4","Type":"ContainerDied","Data":"03b2fbb557ad5068b1b9d736c30c813732acb8a964a30966917d75f68ff13c6e"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.793120 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee747235-93a8-420f-95b6-232cdf8a3223","Type":"ContainerDied","Data":"c6c367380d8db5fd171c79a8671c037dadfcf9a955d7835d97f78c4c8d26a4a1"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.793183 4935 scope.go:117] "RemoveContainer" 
containerID="b59fef636375afb6a8b1c06ccfd489e69629eb432f6376cb2e3c477ba0298ae5" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.793362 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.801819 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" event={"ID":"e4ef2a77-ca25-495d-a00c-f15993955019","Type":"ContainerStarted","Data":"f9f81c7bd7caa8f47ae438cc7700a5f2ae2316454b721228406dd4d0eb236b59"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.831573 4935 generic.go:334] "Generic (PLEG): container finished" podID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerID="3530ad080d1edcab525f47d2d9a66c1e08f558003b8a420933927b17595a5589" exitCode=0 Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.831680 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f77988b8-57qwv" event={"ID":"6796ebed-95e9-4af3-a76f-0acc484ddbfb","Type":"ContainerDied","Data":"3530ad080d1edcab525f47d2d9a66c1e08f558003b8a420933927b17595a5589"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.877655 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfdadde0-a814-47bc-a025-db93796c38c4","Type":"ContainerStarted","Data":"a0484bb519b680c294895515c325a3eb9408e5e3bd6dd043cb96c150ff26f7cc"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.884516 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8fc8c65dd-5rdnt" event={"ID":"671ea9f9-7bde-4f11-8793-67f21663f8ec","Type":"ContainerStarted","Data":"0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.886364 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.886387 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.910804 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fff79bd9-55nbl" event={"ID":"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863","Type":"ContainerStarted","Data":"d4697911957e031d69d443233d9ed4ec139de0071b02bd0091576741173dd813"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.919120 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.935932 4935 generic.go:334] "Generic (PLEG): container finished" podID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerID="f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3" exitCode=0 Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.936001 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" event={"ID":"7ae87ac8-5831-4714-bd2a-806b03f485aa","Type":"ContainerDied","Data":"f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3"} Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.940908 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.980468 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:13 crc kubenswrapper[4935]: E1217 09:25:13.981545 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerName="horizon-log" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981566 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerName="horizon-log" Dec 17 09:25:13 crc kubenswrapper[4935]: E1217 09:25:13.981582 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" 
containerName="horizon" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981590 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerName="horizon" Dec 17 09:25:13 crc kubenswrapper[4935]: E1217 09:25:13.981602 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-notification-agent" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981610 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-notification-agent" Dec 17 09:25:13 crc kubenswrapper[4935]: E1217 09:25:13.981634 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerName="horizon-log" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981642 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerName="horizon-log" Dec 17 09:25:13 crc kubenswrapper[4935]: E1217 09:25:13.981669 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="sg-core" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981676 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="sg-core" Dec 17 09:25:13 crc kubenswrapper[4935]: E1217 09:25:13.981692 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-central-agent" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981700 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-central-agent" Dec 17 09:25:13 crc kubenswrapper[4935]: E1217 09:25:13.981714 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" 
containerName="horizon" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981722 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerName="horizon" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.981987 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="sg-core" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.982011 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerName="horizon-log" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.982029 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerName="horizon-log" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.982042 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-central-agent" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.982057 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b865a47-3ef5-4de0-88f8-4eeaba5de2cc" containerName="horizon" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.982067 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" containerName="ceilometer-notification-agent" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.982085 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed5ba8df-52ae-47ef-ad50-8ccb05fa65d7" containerName="horizon" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.984025 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.984898 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podStartSLOduration=14.984886661 podStartE2EDuration="14.984886661s" podCreationTimestamp="2025-12-17 09:24:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:13.9222059 +0000 UTC m=+1233.582046663" watchObservedRunningTime="2025-12-17 09:25:13.984886661 +0000 UTC m=+1233.644727424" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.985965 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 17 09:25:13 crc kubenswrapper[4935]: I1217 09:25:13.991625 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:13.997837 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.112996 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxdzv\" (UniqueName: \"kubernetes.io/projected/73e821de-bc88-4508-a699-85eac867742d-kube-api-access-sxdzv\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.113131 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-log-httpd\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.113213 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.113253 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-scripts\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.113349 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.113401 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-config-data\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.113448 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-run-httpd\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.220878 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxdzv\" (UniqueName: 
\"kubernetes.io/projected/73e821de-bc88-4508-a699-85eac867742d-kube-api-access-sxdzv\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.220976 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-log-httpd\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.221028 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.221064 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-scripts\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.221104 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.221125 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-config-data\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 
09:25:14.221156 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-run-httpd\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.227221 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-run-httpd\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.231731 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-log-httpd\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.234372 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-scripts\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.239200 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.240154 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-config-data\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " 
pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.244144 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxdzv\" (UniqueName: \"kubernetes.io/projected/73e821de-bc88-4508-a699-85eac867742d-kube-api-access-sxdzv\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.253515 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.317861 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.510356 4935 scope.go:117] "RemoveContainer" containerID="0ba2fc44e91ee7ba44310f25eaf57f763e280313b2746fec7830e9ec68e45894" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.712137 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.727173 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.752226 4935 scope.go:117] "RemoveContainer" containerID="33e694debc362dac64987513367abf8519814666ab8bf4d62e9ac950f0001602" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.840790 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-httpd-config\") pod \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.840841 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-nb\") pod \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.840949 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-config\") pod \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.840975 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-svc\") pod \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.841054 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9q6\" (UniqueName: \"kubernetes.io/projected/6796ebed-95e9-4af3-a76f-0acc484ddbfb-kube-api-access-gm9q6\") pod \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " Dec 17 
09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.841076 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-sb\") pod \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.841116 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-combined-ca-bundle\") pod \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.841149 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-swift-storage-0\") pod \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.841222 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-config\") pod \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.841254 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pspqc\" (UniqueName: \"kubernetes.io/projected/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-kube-api-access-pspqc\") pod \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\" (UID: \"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.841316 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-ovndb-tls-certs\") pod \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\" (UID: \"6796ebed-95e9-4af3-a76f-0acc484ddbfb\") " Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.877565 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6796ebed-95e9-4af3-a76f-0acc484ddbfb-kube-api-access-gm9q6" (OuterVolumeSpecName: "kube-api-access-gm9q6") pod "6796ebed-95e9-4af3-a76f-0acc484ddbfb" (UID: "6796ebed-95e9-4af3-a76f-0acc484ddbfb"). InnerVolumeSpecName "kube-api-access-gm9q6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.888020 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-kube-api-access-pspqc" (OuterVolumeSpecName: "kube-api-access-pspqc") pod "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" (UID: "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4"). InnerVolumeSpecName "kube-api-access-pspqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.890484 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "6796ebed-95e9-4af3-a76f-0acc484ddbfb" (UID: "6796ebed-95e9-4af3-a76f-0acc484ddbfb"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:14 crc kubenswrapper[4935]: I1217 09:25:14.994165 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" (UID: "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.010927 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" (UID: "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.021245 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pspqc\" (UniqueName: \"kubernetes.io/projected/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-kube-api-access-pspqc\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.021309 4935 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.021322 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm9q6\" (UniqueName: \"kubernetes.io/projected/6796ebed-95e9-4af3-a76f-0acc484ddbfb-kube-api-access-gm9q6\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.021339 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.021349 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.036922 4935 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-config" (OuterVolumeSpecName: "config") pod "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" (UID: "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.064298 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" (UID: "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.067916 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" (UID: "6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.091668 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6796ebed-95e9-4af3-a76f-0acc484ddbfb" (UID: "6796ebed-95e9-4af3-a76f-0acc484ddbfb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.113899 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" event={"ID":"6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4","Type":"ContainerDied","Data":"97c36558e9fe80dd708203be19e1ad451de7f7ce5d01484996a7c43b3e4b87be"} Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.113982 4935 scope.go:117] "RemoveContainer" containerID="03b2fbb557ad5068b1b9d736c30c813732acb8a964a30966917d75f68ff13c6e" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.114223 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bdf86f46f-2vpt9" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.147455 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-config" (OuterVolumeSpecName: "config") pod "6796ebed-95e9-4af3-a76f-0acc484ddbfb" (UID: "6796ebed-95e9-4af3-a76f-0acc484ddbfb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.163953 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.164012 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.164025 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.164034 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.164059 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.181745 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee747235-93a8-420f-95b6-232cdf8a3223" path="/var/lib/kubelet/pods/ee747235-93a8-420f-95b6-232cdf8a3223/volumes" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.196435 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" event={"ID":"e4ef2a77-ca25-495d-a00c-f15993955019","Type":"ContainerStarted","Data":"4c3df0118c145077401eb095ca5c83f8af564a6fcad9d6e57f71374283b84a68"} Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.196589 4935 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.196613 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.236253 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" podStartSLOduration=12.23622768 podStartE2EDuration="12.23622768s" podCreationTimestamp="2025-12-17 09:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:15.229859315 +0000 UTC m=+1234.889700068" watchObservedRunningTime="2025-12-17 09:25:15.23622768 +0000 UTC m=+1234.896068443" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.238168 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.250933 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-84f77988b8-57qwv" event={"ID":"6796ebed-95e9-4af3-a76f-0acc484ddbfb","Type":"ContainerDied","Data":"44e8eda982bccbf6bc6fb8e8e1ba2a567770a6aba1fb17da49c2d1254b5ba516"} Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.252735 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-84f77988b8-57qwv" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.265443 4935 scope.go:117] "RemoveContainer" containerID="780433440bfe8714d3a013717e8a34ddf923ead49f1825269886dcb6621a5661" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.298169 4935 generic.go:334] "Generic (PLEG): container finished" podID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerID="a32917698924eed5e245fd899d7cd4530d8177e68ff09f8c11ed2b36000b275e" exitCode=1 Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.298229 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8fc8c65dd-5rdnt" event={"ID":"671ea9f9-7bde-4f11-8793-67f21663f8ec","Type":"ContainerDied","Data":"a32917698924eed5e245fd899d7cd4530d8177e68ff09f8c11ed2b36000b275e"} Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.299121 4935 scope.go:117] "RemoveContainer" containerID="a32917698924eed5e245fd899d7cd4530d8177e68ff09f8c11ed2b36000b275e" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.337572 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-2vpt9"] Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.353160 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bdf86f46f-2vpt9"] Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.360724 4935 scope.go:117] "RemoveContainer" containerID="3530ad080d1edcab525f47d2d9a66c1e08f558003b8a420933927b17595a5589" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.528604 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "6796ebed-95e9-4af3-a76f-0acc484ddbfb" (UID: "6796ebed-95e9-4af3-a76f-0acc484ddbfb"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.576810 4935 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6796ebed-95e9-4af3-a76f-0acc484ddbfb-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.736188 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-84f77988b8-57qwv"] Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.746058 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-84f77988b8-57qwv"] Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.885415 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7bbfb6547d-64jt7" Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.968951 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54c44548bb-26j2l"] Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.969250 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon-log" containerID="cri-o://26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7" gracePeriod=30 Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.969743 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" containerID="cri-o://0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1" gracePeriod=30 Dec 17 09:25:15 crc kubenswrapper[4935]: I1217 09:25:15.994060 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" probeResult="failure" output="Get 
\"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.000908 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.328354 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8fc8c65dd-5rdnt" event={"ID":"671ea9f9-7bde-4f11-8793-67f21663f8ec","Type":"ContainerStarted","Data":"f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.328714 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.345174 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" event={"ID":"4798a48d-9fdb-40d8-9890-595874a05215","Type":"ContainerStarted","Data":"a4a6d95fe9e838f79089f6857bfe627809ebc9b5f42926f30abecdb61fc5d75c"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.345222 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" event={"ID":"4798a48d-9fdb-40d8-9890-595874a05215","Type":"ContainerStarted","Data":"7284d675825bae9c7dea567506971eec95aaca8ef4f84f5fd295f27747861bbb"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.355468 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" event={"ID":"7ae87ac8-5831-4714-bd2a-806b03f485aa","Type":"ContainerStarted","Data":"fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.356606 4935 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.370957 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69fff79bd9-55nbl" event={"ID":"93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863","Type":"ContainerStarted","Data":"43117485b85061fdcd24e6f731118b317cf9f70c384e27e9b8975c096379a46a"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.379653 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d273eade-a4a7-4f15-92e0-058d24689c84","Type":"ContainerStarted","Data":"8a03a966bd0672b4ed72604f5ea9e0ce6fa754b1b9e44becfc231ad8d3f9581b"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.394205 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerStarted","Data":"98c8c9fcaa55332ef3fe88d95bfc86074244f59f80b3d36c8ca107bcec4b14f9"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.395283 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" podStartSLOduration=9.395249895 podStartE2EDuration="9.395249895s" podCreationTimestamp="2025-12-17 09:25:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:16.394825895 +0000 UTC m=+1236.054666658" watchObservedRunningTime="2025-12-17 09:25:16.395249895 +0000 UTC m=+1236.055090658" Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.428508 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5f4bb4f7d-ntfxs" podStartSLOduration=15.173163628 podStartE2EDuration="17.428483657s" podCreationTimestamp="2025-12-17 09:24:59 +0000 UTC" firstStartedPulling="2025-12-17 09:25:12.390571795 +0000 UTC m=+1232.050412558" lastFinishedPulling="2025-12-17 
09:25:14.645891814 +0000 UTC m=+1234.305732587" observedRunningTime="2025-12-17 09:25:16.425139286 +0000 UTC m=+1236.084980049" watchObservedRunningTime="2025-12-17 09:25:16.428483657 +0000 UTC m=+1236.088324420" Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.431127 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api-log" containerID="cri-o://a0484bb519b680c294895515c325a3eb9408e5e3bd6dd043cb96c150ff26f7cc" gracePeriod=30 Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.432403 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfdadde0-a814-47bc-a025-db93796c38c4","Type":"ContainerStarted","Data":"514cc1f9c3308173f69c7b81ca13995119f22f2b651a06ff50a0c37baff0eded"} Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.432471 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.432523 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api" containerID="cri-o://514cc1f9c3308173f69c7b81ca13995119f22f2b651a06ff50a0c37baff0eded" gracePeriod=30 Dec 17 09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.474884 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69fff79bd9-55nbl" podStartSLOduration=10.8249979 podStartE2EDuration="17.47486144s" podCreationTimestamp="2025-12-17 09:24:59 +0000 UTC" firstStartedPulling="2025-12-17 09:25:06.139013382 +0000 UTC m=+1225.798854145" lastFinishedPulling="2025-12-17 09:25:12.788876922 +0000 UTC m=+1232.448717685" observedRunningTime="2025-12-17 09:25:16.454773609 +0000 UTC m=+1236.114614372" watchObservedRunningTime="2025-12-17 09:25:16.47486144 +0000 UTC m=+1236.134702203" Dec 17 
09:25:16 crc kubenswrapper[4935]: I1217 09:25:16.523999 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=8.523967759 podStartE2EDuration="8.523967759s" podCreationTimestamp="2025-12-17 09:25:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:16.479570254 +0000 UTC m=+1236.139411017" watchObservedRunningTime="2025-12-17 09:25:16.523967759 +0000 UTC m=+1236.183808522" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.137638 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" path="/var/lib/kubelet/pods/6796ebed-95e9-4af3-a76f-0acc484ddbfb/volumes" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.139033 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" path="/var/lib/kubelet/pods/6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4/volumes" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.462044 4935 generic.go:334] "Generic (PLEG): container finished" podID="bfdadde0-a814-47bc-a025-db93796c38c4" containerID="514cc1f9c3308173f69c7b81ca13995119f22f2b651a06ff50a0c37baff0eded" exitCode=0 Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.462095 4935 generic.go:334] "Generic (PLEG): container finished" podID="bfdadde0-a814-47bc-a025-db93796c38c4" containerID="a0484bb519b680c294895515c325a3eb9408e5e3bd6dd043cb96c150ff26f7cc" exitCode=143 Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.462156 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfdadde0-a814-47bc-a025-db93796c38c4","Type":"ContainerDied","Data":"514cc1f9c3308173f69c7b81ca13995119f22f2b651a06ff50a0c37baff0eded"} Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.462192 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-api-0" event={"ID":"bfdadde0-a814-47bc-a025-db93796c38c4","Type":"ContainerDied","Data":"a0484bb519b680c294895515c325a3eb9408e5e3bd6dd043cb96c150ff26f7cc"} Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.475829 4935 generic.go:334] "Generic (PLEG): container finished" podID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerID="f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153" exitCode=1 Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.475937 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8fc8c65dd-5rdnt" event={"ID":"671ea9f9-7bde-4f11-8793-67f21663f8ec","Type":"ContainerDied","Data":"f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153"} Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.475990 4935 scope.go:117] "RemoveContainer" containerID="a32917698924eed5e245fd899d7cd4530d8177e68ff09f8c11ed2b36000b275e" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.478246 4935 scope.go:117] "RemoveContainer" containerID="f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153" Dec 17 09:25:17 crc kubenswrapper[4935]: E1217 09:25:17.478753 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-8fc8c65dd-5rdnt_openstack(671ea9f9-7bde-4f11-8793-67f21663f8ec)\"" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.509347 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d273eade-a4a7-4f15-92e0-058d24689c84","Type":"ContainerStarted","Data":"fd8e78f181102a463c35b87cad0742d06048e38d550bb0c211bfdb0079a94551"} Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.535155 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerStarted","Data":"679a1bc116b0cee3e8c4d47f888a549e89f4f4f4fad87e14fd583b341f37ce12"} Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.568177 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=8.381600991 podStartE2EDuration="10.56814372s" podCreationTimestamp="2025-12-17 09:25:07 +0000 UTC" firstStartedPulling="2025-12-17 09:25:12.386792863 +0000 UTC m=+1232.046633636" lastFinishedPulling="2025-12-17 09:25:14.573335602 +0000 UTC m=+1234.233176365" observedRunningTime="2025-12-17 09:25:17.556796732 +0000 UTC m=+1237.216637495" watchObservedRunningTime="2025-12-17 09:25:17.56814372 +0000 UTC m=+1237.227984483" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.759918 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.850501 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-combined-ca-bundle\") pod \"bfdadde0-a814-47bc-a025-db93796c38c4\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.850583 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-scripts\") pod \"bfdadde0-a814-47bc-a025-db93796c38c4\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.850610 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data-custom\") pod \"bfdadde0-a814-47bc-a025-db93796c38c4\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " Dec 
17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.850688 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzj9x\" (UniqueName: \"kubernetes.io/projected/bfdadde0-a814-47bc-a025-db93796c38c4-kube-api-access-xzj9x\") pod \"bfdadde0-a814-47bc-a025-db93796c38c4\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.850729 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdadde0-a814-47bc-a025-db93796c38c4-logs\") pod \"bfdadde0-a814-47bc-a025-db93796c38c4\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.850855 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bfdadde0-a814-47bc-a025-db93796c38c4-etc-machine-id\") pod \"bfdadde0-a814-47bc-a025-db93796c38c4\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.850880 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data\") pod \"bfdadde0-a814-47bc-a025-db93796c38c4\" (UID: \"bfdadde0-a814-47bc-a025-db93796c38c4\") " Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.854949 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfdadde0-a814-47bc-a025-db93796c38c4-logs" (OuterVolumeSpecName: "logs") pod "bfdadde0-a814-47bc-a025-db93796c38c4" (UID: "bfdadde0-a814-47bc-a025-db93796c38c4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.861479 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfdadde0-a814-47bc-a025-db93796c38c4-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bfdadde0-a814-47bc-a025-db93796c38c4" (UID: "bfdadde0-a814-47bc-a025-db93796c38c4"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.863331 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfdadde0-a814-47bc-a025-db93796c38c4-kube-api-access-xzj9x" (OuterVolumeSpecName: "kube-api-access-xzj9x") pod "bfdadde0-a814-47bc-a025-db93796c38c4" (UID: "bfdadde0-a814-47bc-a025-db93796c38c4"). InnerVolumeSpecName "kube-api-access-xzj9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.871479 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bfdadde0-a814-47bc-a025-db93796c38c4" (UID: "bfdadde0-a814-47bc-a025-db93796c38c4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.874653 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-scripts" (OuterVolumeSpecName: "scripts") pod "bfdadde0-a814-47bc-a025-db93796c38c4" (UID: "bfdadde0-a814-47bc-a025-db93796c38c4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.899407 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bfdadde0-a814-47bc-a025-db93796c38c4" (UID: "bfdadde0-a814-47bc-a025-db93796c38c4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.954487 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.954530 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.954541 4935 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.954553 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzj9x\" (UniqueName: \"kubernetes.io/projected/bfdadde0-a814-47bc-a025-db93796c38c4-kube-api-access-xzj9x\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.954564 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bfdadde0-a814-47bc-a025-db93796c38c4-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.954575 4935 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/bfdadde0-a814-47bc-a025-db93796c38c4-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:17 crc kubenswrapper[4935]: I1217 09:25:17.976512 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data" (OuterVolumeSpecName: "config-data") pod "bfdadde0-a814-47bc-a025-db93796c38c4" (UID: "bfdadde0-a814-47bc-a025-db93796c38c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.057086 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfdadde0-a814-47bc-a025-db93796c38c4-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.292671 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.294175 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.546196 4935 scope.go:117] "RemoveContainer" containerID="f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.546452 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Dec 17 09:25:18 crc kubenswrapper[4935]: E1217 09:25:18.546477 4935 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-8fc8c65dd-5rdnt_openstack(671ea9f9-7bde-4f11-8793-67f21663f8ec)\"" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.548696 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerStarted","Data":"9715846b579f96180a870d2250f78e9a922a73afa7e198eb107b10e87e0e5317"} Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.551001 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bfdadde0-a814-47bc-a025-db93796c38c4","Type":"ContainerDied","Data":"625f7f611b1f29028338da8f9e96c8badc029378089f04cc6ccc27235f3f5919"} Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.551070 4935 scope.go:117] "RemoveContainer" containerID="514cc1f9c3308173f69c7b81ca13995119f22f2b651a06ff50a0c37baff0eded" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.551381 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.631347 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.651134 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.700206 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 17 09:25:18 crc kubenswrapper[4935]: E1217 09:25:18.701026 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api-log" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.701150 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api-log" Dec 17 09:25:18 crc kubenswrapper[4935]: E1217 09:25:18.701238 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-api" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.701338 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-api" Dec 17 09:25:18 crc kubenswrapper[4935]: E1217 09:25:18.704589 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.704702 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api" Dec 17 09:25:18 crc kubenswrapper[4935]: E1217 09:25:18.704855 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-httpd" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.704933 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-httpd" Dec 17 09:25:18 crc kubenswrapper[4935]: E1217 09:25:18.705025 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" containerName="init" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.705100 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" containerName="init" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.705621 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.705731 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" containerName="cinder-api-log" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.705816 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-httpd" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.705915 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6796ebed-95e9-4af3-a76f-0acc484ddbfb" containerName="neutron-api" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.706010 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fd8f325-5e7f-4ce1-8a61-ad15d8503ed4" containerName="init" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.707649 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.712867 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.713117 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.713497 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.767361 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804655 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804733 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a66562-ec0b-4302-8e76-a4567917d90a-logs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804773 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804820 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cx974\" (UniqueName: \"kubernetes.io/projected/f6a66562-ec0b-4302-8e76-a4567917d90a-kube-api-access-cx974\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804867 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804913 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6a66562-ec0b-4302-8e76-a4567917d90a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804928 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-scripts\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.804951 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-config-data\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.805006 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-config-data-custom\") pod 
\"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.909750 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6a66562-ec0b-4302-8e76-a4567917d90a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.909812 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-scripts\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.909850 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-config-data\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.909881 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-config-data-custom\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.909911 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.909939 4935 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a66562-ec0b-4302-8e76-a4567917d90a-logs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.909973 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.910014 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx974\" (UniqueName: \"kubernetes.io/projected/f6a66562-ec0b-4302-8e76-a4567917d90a-kube-api-access-cx974\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.910055 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.911468 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f6a66562-ec0b-4302-8e76-a4567917d90a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.914400 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f6a66562-ec0b-4302-8e76-a4567917d90a-logs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc 
kubenswrapper[4935]: I1217 09:25:18.922764 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-scripts\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.927172 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.928832 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.935385 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-config-data\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.935978 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-public-tls-certs\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.947523 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f6a66562-ec0b-4302-8e76-a4567917d90a-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:18 crc kubenswrapper[4935]: I1217 09:25:18.963908 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx974\" (UniqueName: \"kubernetes.io/projected/f6a66562-ec0b-4302-8e76-a4567917d90a-kube-api-access-cx974\") pod \"cinder-api-0\" (UID: \"f6a66562-ec0b-4302-8e76-a4567917d90a\") " pod="openstack/cinder-api-0" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.038848 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 17 09:25:19 crc kubenswrapper[4935]: E1217 09:25:19.136076 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cca130e_5dbc_4edb_b0d9_f04a0ecd2ea6.slice/crio-conmon-60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57.scope\": RecentStats: unable to find data in memory cache]" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.151257 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfdadde0-a814-47bc-a025-db93796c38c4" path="/var/lib/kubelet/pods/bfdadde0-a814-47bc-a025-db93796c38c4/volumes" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.285565 4935 scope.go:117] "RemoveContainer" containerID="a0484bb519b680c294895515c325a3eb9408e5e3bd6dd043cb96c150ff26f7cc" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.410319 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:37816->10.217.0.146:8443: read: connection reset by peer" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.498865 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.656966 4935 generic.go:334] "Generic (PLEG): container finished" podID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerID="0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1" exitCode=0 Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.657072 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c44548bb-26j2l" event={"ID":"5385d045-3f7c-447d-8ce8-d12a8de0cdce","Type":"ContainerDied","Data":"0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1"} Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.664255 4935 scope.go:117] "RemoveContainer" containerID="f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153" Dec 17 09:25:19 crc kubenswrapper[4935]: E1217 09:25:19.665012 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=barbican-api pod=barbican-api-8fc8c65dd-5rdnt_openstack(671ea9f9-7bde-4f11-8793-67f21663f8ec)\"" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.665909 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.730983 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:25:19 crc kubenswrapper[4935]: I1217 09:25:19.921428 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 17 09:25:20 crc kubenswrapper[4935]: I1217 09:25:20.208998 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5fc9d9db96-gh2fm" Dec 17 09:25:20 crc kubenswrapper[4935]: I1217 09:25:20.292803 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Dec 17 09:25:20 crc kubenswrapper[4935]: I1217 09:25:20.684595 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerStarted","Data":"0acc78042528e26326a44346f2403ef147dcf641bfd0b5f2339cb6cc13083dd6"} Dec 17 09:25:20 crc kubenswrapper[4935]: I1217 09:25:20.690450 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f6a66562-ec0b-4302-8e76-a4567917d90a","Type":"ContainerStarted","Data":"ae9b2ccc828c1355186eabd5cdaca0e4105f3f9161ba86bc0ba4c55d6e16470b"} Dec 17 09:25:20 crc kubenswrapper[4935]: I1217 09:25:20.852979 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-548ff6dcf4-brlq9" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.275813 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.278019 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.285556 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.286158 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.286513 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-76p4v" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.304477 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.307940 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.314791 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-openstack-config\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.314990 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.315019 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-openstack-config-secret\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.315047 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcllp\" (UniqueName: \"kubernetes.io/projected/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-kube-api-access-dcllp\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.418431 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.418518 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-openstack-config-secret\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.418574 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcllp\" (UniqueName: \"kubernetes.io/projected/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-kube-api-access-dcllp\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.418653 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-openstack-config\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.420981 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-openstack-config\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.425986 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-combined-ca-bundle\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.426899 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-openstack-config-secret\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.439661 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcllp\" (UniqueName: \"kubernetes.io/projected/31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0-kube-api-access-dcllp\") pod \"openstackclient\" (UID: \"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0\") " pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.668768 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 17 09:25:21 crc kubenswrapper[4935]: I1217 09:25:21.722600 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f6a66562-ec0b-4302-8e76-a4567917d90a","Type":"ContainerStarted","Data":"a30c725f277137bc2c7de9d2a940d7a9022565b204c5659b5661ddbd5bd6e18d"} Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.359857 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 17 09:25:22 crc kubenswrapper[4935]: W1217 09:25:22.379661 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f0e1f2_aecd_4e5e_96a6_deeee6e7bdb0.slice/crio-7787bdf0c90aa4c1ffc9d88641fd05a657c5c5eae7160781c694bd8f72401e54 WatchSource:0}: Error finding container 7787bdf0c90aa4c1ffc9d88641fd05a657c5c5eae7160781c694bd8f72401e54: Status 404 returned error can't find the container with id 7787bdf0c90aa4c1ffc9d88641fd05a657c5c5eae7160781c694bd8f72401e54 Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.505263 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.725694 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.737481 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerStarted","Data":"8a18f97061cd8c12313bc57a4887bfddf513e7a684a4420f301333b179398bd1"} Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.737876 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.739136 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstackclient" event={"ID":"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0","Type":"ContainerStarted","Data":"7787bdf0c90aa4c1ffc9d88641fd05a657c5c5eae7160781c694bd8f72401e54"} Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.759509 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f6a66562-ec0b-4302-8e76-a4567917d90a","Type":"ContainerStarted","Data":"feac22e6e1a168a9ad1306481646c0c85ac9fa84272897bc29b9bd03d2d97b88"} Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.759620 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.862647 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.37540416 podStartE2EDuration="9.862622878s" podCreationTimestamp="2025-12-17 09:25:13 +0000 UTC" firstStartedPulling="2025-12-17 09:25:15.27389767 +0000 UTC m=+1234.933738433" lastFinishedPulling="2025-12-17 09:25:21.761116388 +0000 UTC m=+1241.420957151" observedRunningTime="2025-12-17 09:25:22.831738113 +0000 UTC m=+1242.491578876" watchObservedRunningTime="2025-12-17 09:25:22.862622878 +0000 UTC m=+1242.522463641" Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.916645 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8fc8c65dd-5rdnt"] Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.916974 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" containerID="cri-o://0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409" gracePeriod=30 Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.917948 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8fc8c65dd-5rdnt" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" 
containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.159:9311/healthcheck\": dial tcp 10.217.0.159:9311: connect: connection refused" Dec 17 09:25:22 crc kubenswrapper[4935]: I1217 09:25:22.927813 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.927783429 podStartE2EDuration="4.927783429s" podCreationTimestamp="2025-12-17 09:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:22.872964231 +0000 UTC m=+1242.532804994" watchObservedRunningTime="2025-12-17 09:25:22.927783429 +0000 UTC m=+1242.587624192" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.389603 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-5b5f4b44fb-ktsl4" podUID="e4ef2a77-ca25-495d-a00c-f15993955019" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.160:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.546333 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.662508 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.733465 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data-custom\") pod \"671ea9f9-7bde-4f11-8793-67f21663f8ec\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.733546 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-combined-ca-bundle\") pod \"671ea9f9-7bde-4f11-8793-67f21663f8ec\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.733598 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/671ea9f9-7bde-4f11-8793-67f21663f8ec-logs\") pod \"671ea9f9-7bde-4f11-8793-67f21663f8ec\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.733635 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data\") pod \"671ea9f9-7bde-4f11-8793-67f21663f8ec\" (UID: \"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.733672 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkqsp\" (UniqueName: \"kubernetes.io/projected/671ea9f9-7bde-4f11-8793-67f21663f8ec-kube-api-access-dkqsp\") pod \"671ea9f9-7bde-4f11-8793-67f21663f8ec\" (UID: 
\"671ea9f9-7bde-4f11-8793-67f21663f8ec\") " Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.738231 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/671ea9f9-7bde-4f11-8793-67f21663f8ec-logs" (OuterVolumeSpecName: "logs") pod "671ea9f9-7bde-4f11-8793-67f21663f8ec" (UID: "671ea9f9-7bde-4f11-8793-67f21663f8ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.752827 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/671ea9f9-7bde-4f11-8793-67f21663f8ec-kube-api-access-dkqsp" (OuterVolumeSpecName: "kube-api-access-dkqsp") pod "671ea9f9-7bde-4f11-8793-67f21663f8ec" (UID: "671ea9f9-7bde-4f11-8793-67f21663f8ec"). InnerVolumeSpecName "kube-api-access-dkqsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.774784 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "671ea9f9-7bde-4f11-8793-67f21663f8ec" (UID: "671ea9f9-7bde-4f11-8793-67f21663f8ec"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.800557 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-knwqw"] Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.800878 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" podUID="95378004-259e-4ed5-b3ac-2e7870949a91" containerName="dnsmasq-dns" containerID="cri-o://4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70" gracePeriod=10 Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.805576 4935 generic.go:334] "Generic (PLEG): container finished" podID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerID="0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409" exitCode=143 Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.805755 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8fc8c65dd-5rdnt" event={"ID":"671ea9f9-7bde-4f11-8793-67f21663f8ec","Type":"ContainerDied","Data":"0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409"} Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.806419 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8fc8c65dd-5rdnt" event={"ID":"671ea9f9-7bde-4f11-8793-67f21663f8ec","Type":"ContainerDied","Data":"ff66392d9438c5568e9325aaef17830f6f3789b0feb10955ce3fc4ab04e2de88"} Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.806541 4935 scope.go:117] "RemoveContainer" containerID="f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.809186 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8fc8c65dd-5rdnt" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.835960 4935 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.836004 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/671ea9f9-7bde-4f11-8793-67f21663f8ec-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.836025 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkqsp\" (UniqueName: \"kubernetes.io/projected/671ea9f9-7bde-4f11-8793-67f21663f8ec-kube-api-access-dkqsp\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.871629 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "671ea9f9-7bde-4f11-8793-67f21663f8ec" (UID: "671ea9f9-7bde-4f11-8793-67f21663f8ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.909393 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data" (OuterVolumeSpecName: "config-data") pod "671ea9f9-7bde-4f11-8793-67f21663f8ec" (UID: "671ea9f9-7bde-4f11-8793-67f21663f8ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.940447 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:23 crc kubenswrapper[4935]: I1217 09:25:23.940487 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/671ea9f9-7bde-4f11-8793-67f21663f8ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.065558 4935 scope.go:117] "RemoveContainer" containerID="0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.132344 4935 scope.go:117] "RemoveContainer" containerID="f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153" Dec 17 09:25:24 crc kubenswrapper[4935]: E1217 09:25:24.133367 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153\": container with ID starting with f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153 not found: ID does not exist" containerID="f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.133399 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153"} err="failed to get container status \"f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153\": rpc error: code = NotFound desc = could not find container \"f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153\": container with ID starting with f56074bcaa37f688e4e07bd07a76cdf934fd3c36d529f03193ea5d2e97bd2153 not found: ID does not 
exist" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.133427 4935 scope.go:117] "RemoveContainer" containerID="0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409" Dec 17 09:25:24 crc kubenswrapper[4935]: E1217 09:25:24.133881 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409\": container with ID starting with 0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409 not found: ID does not exist" containerID="0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.133917 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409"} err="failed to get container status \"0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409\": rpc error: code = NotFound desc = could not find container \"0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409\": container with ID starting with 0c60a2b8486f5710bf1c931d06fd5c34be764ac45159c867521c2c08cf8b4409 not found: ID does not exist" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.172533 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8fc8c65dd-5rdnt"] Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.186187 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8fc8c65dd-5rdnt"] Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.408871 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.555722 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-config\") pod \"95378004-259e-4ed5-b3ac-2e7870949a91\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.555866 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-swift-storage-0\") pod \"95378004-259e-4ed5-b3ac-2e7870949a91\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.555921 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx2tw\" (UniqueName: \"kubernetes.io/projected/95378004-259e-4ed5-b3ac-2e7870949a91-kube-api-access-xx2tw\") pod \"95378004-259e-4ed5-b3ac-2e7870949a91\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.555983 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-sb\") pod \"95378004-259e-4ed5-b3ac-2e7870949a91\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.556028 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-svc\") pod \"95378004-259e-4ed5-b3ac-2e7870949a91\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.556150 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-nb\") pod \"95378004-259e-4ed5-b3ac-2e7870949a91\" (UID: \"95378004-259e-4ed5-b3ac-2e7870949a91\") " Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.593558 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95378004-259e-4ed5-b3ac-2e7870949a91-kube-api-access-xx2tw" (OuterVolumeSpecName: "kube-api-access-xx2tw") pod "95378004-259e-4ed5-b3ac-2e7870949a91" (UID: "95378004-259e-4ed5-b3ac-2e7870949a91"). InnerVolumeSpecName "kube-api-access-xx2tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.652608 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-config" (OuterVolumeSpecName: "config") pod "95378004-259e-4ed5-b3ac-2e7870949a91" (UID: "95378004-259e-4ed5-b3ac-2e7870949a91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.658996 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx2tw\" (UniqueName: \"kubernetes.io/projected/95378004-259e-4ed5-b3ac-2e7870949a91-kube-api-access-xx2tw\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.659030 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.673946 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "95378004-259e-4ed5-b3ac-2e7870949a91" (UID: "95378004-259e-4ed5-b3ac-2e7870949a91"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.682051 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95378004-259e-4ed5-b3ac-2e7870949a91" (UID: "95378004-259e-4ed5-b3ac-2e7870949a91"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.699950 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95378004-259e-4ed5-b3ac-2e7870949a91" (UID: "95378004-259e-4ed5-b3ac-2e7870949a91"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.703943 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95378004-259e-4ed5-b3ac-2e7870949a91" (UID: "95378004-259e-4ed5-b3ac-2e7870949a91"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.762725 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.762768 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.762779 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.762792 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95378004-259e-4ed5-b3ac-2e7870949a91-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.839626 4935 generic.go:334] "Generic (PLEG): container finished" podID="95378004-259e-4ed5-b3ac-2e7870949a91" containerID="4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70" exitCode=0 Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.839686 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" event={"ID":"95378004-259e-4ed5-b3ac-2e7870949a91","Type":"ContainerDied","Data":"4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70"} Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.839720 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" 
event={"ID":"95378004-259e-4ed5-b3ac-2e7870949a91","Type":"ContainerDied","Data":"1cd71e05e56d827006399a109755b4ce1e84eb9388d63ccc61a7f52c1bbf76fe"} Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.839741 4935 scope.go:117] "RemoveContainer" containerID="4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.839890 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-knwqw" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.897501 4935 scope.go:117] "RemoveContainer" containerID="e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.914542 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-knwqw"] Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.925899 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-knwqw"] Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.963257 4935 scope.go:117] "RemoveContainer" containerID="4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70" Dec 17 09:25:24 crc kubenswrapper[4935]: E1217 09:25:24.964483 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70\": container with ID starting with 4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70 not found: ID does not exist" containerID="4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.964536 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70"} err="failed to get container status 
\"4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70\": rpc error: code = NotFound desc = could not find container \"4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70\": container with ID starting with 4d76d76a6da329b58b1bf6dcc1c394edf3f4ab39d623b0c7b2f3bfe0bac48a70 not found: ID does not exist" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.964583 4935 scope.go:117] "RemoveContainer" containerID="e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647" Dec 17 09:25:24 crc kubenswrapper[4935]: E1217 09:25:24.966157 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647\": container with ID starting with e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647 not found: ID does not exist" containerID="e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647" Dec 17 09:25:24 crc kubenswrapper[4935]: I1217 09:25:24.966191 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647"} err="failed to get container status \"e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647\": rpc error: code = NotFound desc = could not find container \"e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647\": container with ID starting with e37702a4591e1b06a4af358c7c3a836b2d32ad248f8dfc711604440b461ee647 not found: ID does not exist" Dec 17 09:25:25 crc kubenswrapper[4935]: I1217 09:25:25.140847 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" path="/var/lib/kubelet/pods/671ea9f9-7bde-4f11-8793-67f21663f8ec/volumes" Dec 17 09:25:25 crc kubenswrapper[4935]: I1217 09:25:25.141590 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95378004-259e-4ed5-b3ac-2e7870949a91" 
path="/var/lib/kubelet/pods/95378004-259e-4ed5-b3ac-2e7870949a91/volumes" Dec 17 09:25:25 crc kubenswrapper[4935]: I1217 09:25:25.142393 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 17 09:25:25 crc kubenswrapper[4935]: I1217 09:25:25.227423 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 17 09:25:25 crc kubenswrapper[4935]: I1217 09:25:25.855761 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="cinder-scheduler" containerID="cri-o://8a03a966bd0672b4ed72604f5ea9e0ce6fa754b1b9e44becfc231ad8d3f9581b" gracePeriod=30 Dec 17 09:25:25 crc kubenswrapper[4935]: I1217 09:25:25.856709 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="probe" containerID="cri-o://fd8e78f181102a463c35b87cad0742d06048e38d550bb0c211bfdb0079a94551" gracePeriod=30 Dec 17 09:25:26 crc kubenswrapper[4935]: I1217 09:25:26.867258 4935 generic.go:334] "Generic (PLEG): container finished" podID="d273eade-a4a7-4f15-92e0-058d24689c84" containerID="fd8e78f181102a463c35b87cad0742d06048e38d550bb0c211bfdb0079a94551" exitCode=0 Dec 17 09:25:26 crc kubenswrapper[4935]: I1217 09:25:26.867309 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d273eade-a4a7-4f15-92e0-058d24689c84","Type":"ContainerDied","Data":"fd8e78f181102a463c35b87cad0742d06048e38d550bb0c211bfdb0079a94551"} Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.681812 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-767f646f97-whvbb"] Dec 17 09:25:28 crc kubenswrapper[4935]: E1217 09:25:28.682659 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" 
containerName="barbican-api" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.682675 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api" Dec 17 09:25:28 crc kubenswrapper[4935]: E1217 09:25:28.682691 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95378004-259e-4ed5-b3ac-2e7870949a91" containerName="dnsmasq-dns" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.682699 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="95378004-259e-4ed5-b3ac-2e7870949a91" containerName="dnsmasq-dns" Dec 17 09:25:28 crc kubenswrapper[4935]: E1217 09:25:28.682718 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.682725 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api" Dec 17 09:25:28 crc kubenswrapper[4935]: E1217 09:25:28.682742 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95378004-259e-4ed5-b3ac-2e7870949a91" containerName="init" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.682748 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="95378004-259e-4ed5-b3ac-2e7870949a91" containerName="init" Dec 17 09:25:28 crc kubenswrapper[4935]: E1217 09:25:28.682757 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.682763 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.682943 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="95378004-259e-4ed5-b3ac-2e7870949a91" containerName="dnsmasq-dns" Dec 17 09:25:28 crc 
kubenswrapper[4935]: I1217 09:25:28.682966 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api-log" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.682974 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.683371 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="671ea9f9-7bde-4f11-8793-67f21663f8ec" containerName="barbican-api" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.685091 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.696932 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.696927 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.697145 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.703319 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-767f646f97-whvbb"] Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876142 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-internal-tls-certs\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876223 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/562d41ed-7767-48a4-9cf4-84405e6deb48-etc-swift\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876251 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-combined-ca-bundle\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876401 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-public-tls-certs\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876477 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-config-data\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876550 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/562d41ed-7767-48a4-9cf4-84405e6deb48-log-httpd\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876572 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/562d41ed-7767-48a4-9cf4-84405e6deb48-run-httpd\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.876590 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj6sw\" (UniqueName: \"kubernetes.io/projected/562d41ed-7767-48a4-9cf4-84405e6deb48-kube-api-access-cj6sw\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979197 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-config-data\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979404 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/562d41ed-7767-48a4-9cf4-84405e6deb48-log-httpd\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979428 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/562d41ed-7767-48a4-9cf4-84405e6deb48-run-httpd\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979465 4935 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-cj6sw\" (UniqueName: \"kubernetes.io/projected/562d41ed-7767-48a4-9cf4-84405e6deb48-kube-api-access-cj6sw\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979544 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-internal-tls-certs\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979620 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/562d41ed-7767-48a4-9cf4-84405e6deb48-etc-swift\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979645 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-combined-ca-bundle\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.979709 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-public-tls-certs\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.980260 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/562d41ed-7767-48a4-9cf4-84405e6deb48-log-httpd\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.980825 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/562d41ed-7767-48a4-9cf4-84405e6deb48-run-httpd\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.987024 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-internal-tls-certs\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.988945 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/562d41ed-7767-48a4-9cf4-84405e6deb48-etc-swift\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.989088 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-combined-ca-bundle\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.991296 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-public-tls-certs\") pod 
\"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:28 crc kubenswrapper[4935]: I1217 09:25:28.991733 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/562d41ed-7767-48a4-9cf4-84405e6deb48-config-data\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.007166 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj6sw\" (UniqueName: \"kubernetes.io/projected/562d41ed-7767-48a4-9cf4-84405e6deb48-kube-api-access-cj6sw\") pod \"swift-proxy-767f646f97-whvbb\" (UID: \"562d41ed-7767-48a4-9cf4-84405e6deb48\") " pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.063154 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.309420 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.421449 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.421837 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-central-agent" containerID="cri-o://679a1bc116b0cee3e8c4d47f888a549e89f4f4f4fad87e14fd583b341f37ce12" gracePeriod=30 Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.422356 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="proxy-httpd" containerID="cri-o://8a18f97061cd8c12313bc57a4887bfddf513e7a684a4420f301333b179398bd1" gracePeriod=30 Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.422502 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-notification-agent" containerID="cri-o://9715846b579f96180a870d2250f78e9a922a73afa7e198eb107b10e87e0e5317" gracePeriod=30 Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.422571 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="sg-core" containerID="cri-o://0acc78042528e26326a44346f2403ef147dcf641bfd0b5f2339cb6cc13083dd6" gracePeriod=30 Dec 17 09:25:29 crc kubenswrapper[4935]: E1217 09:25:29.479385 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cca130e_5dbc_4edb_b0d9_f04a0ecd2ea6.slice/crio-conmon-60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57.scope\": 
RecentStats: unable to find data in memory cache]" Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.914117 4935 generic.go:334] "Generic (PLEG): container finished" podID="73e821de-bc88-4508-a699-85eac867742d" containerID="8a18f97061cd8c12313bc57a4887bfddf513e7a684a4420f301333b179398bd1" exitCode=0 Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.914628 4935 generic.go:334] "Generic (PLEG): container finished" podID="73e821de-bc88-4508-a699-85eac867742d" containerID="0acc78042528e26326a44346f2403ef147dcf641bfd0b5f2339cb6cc13083dd6" exitCode=2 Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.914191 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerDied","Data":"8a18f97061cd8c12313bc57a4887bfddf513e7a684a4420f301333b179398bd1"} Dec 17 09:25:29 crc kubenswrapper[4935]: I1217 09:25:29.914671 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerDied","Data":"0acc78042528e26326a44346f2403ef147dcf641bfd0b5f2339cb6cc13083dd6"} Dec 17 09:25:30 crc kubenswrapper[4935]: I1217 09:25:30.932506 4935 generic.go:334] "Generic (PLEG): container finished" podID="d273eade-a4a7-4f15-92e0-058d24689c84" containerID="8a03a966bd0672b4ed72604f5ea9e0ce6fa754b1b9e44becfc231ad8d3f9581b" exitCode=0 Dec 17 09:25:30 crc kubenswrapper[4935]: I1217 09:25:30.932607 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d273eade-a4a7-4f15-92e0-058d24689c84","Type":"ContainerDied","Data":"8a03a966bd0672b4ed72604f5ea9e0ce6fa754b1b9e44becfc231ad8d3f9581b"} Dec 17 09:25:30 crc kubenswrapper[4935]: I1217 09:25:30.936132 4935 generic.go:334] "Generic (PLEG): container finished" podID="73e821de-bc88-4508-a699-85eac867742d" containerID="9715846b579f96180a870d2250f78e9a922a73afa7e198eb107b10e87e0e5317" exitCode=0 Dec 17 09:25:30 crc 
kubenswrapper[4935]: I1217 09:25:30.936171 4935 generic.go:334] "Generic (PLEG): container finished" podID="73e821de-bc88-4508-a699-85eac867742d" containerID="679a1bc116b0cee3e8c4d47f888a549e89f4f4f4fad87e14fd583b341f37ce12" exitCode=0 Dec 17 09:25:30 crc kubenswrapper[4935]: I1217 09:25:30.936203 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerDied","Data":"9715846b579f96180a870d2250f78e9a922a73afa7e198eb107b10e87e0e5317"} Dec 17 09:25:30 crc kubenswrapper[4935]: I1217 09:25:30.936240 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerDied","Data":"679a1bc116b0cee3e8c4d47f888a549e89f4f4f4fad87e14fd583b341f37ce12"} Dec 17 09:25:31 crc kubenswrapper[4935]: I1217 09:25:31.537762 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 17 09:25:32 crc kubenswrapper[4935]: I1217 09:25:32.813594 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ttcj9"] Dec 17 09:25:32 crc kubenswrapper[4935]: I1217 09:25:32.815470 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:32 crc kubenswrapper[4935]: I1217 09:25:32.871820 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ttcj9"] Dec 17 09:25:32 crc kubenswrapper[4935]: I1217 09:25:32.929312 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-k26rm"] Dec 17 09:25:32 crc kubenswrapper[4935]: I1217 09:25:32.931492 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:32 crc kubenswrapper[4935]: I1217 09:25:32.939118 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k26rm"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:32.994056 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pghv\" (UniqueName: \"kubernetes.io/projected/a3a20517-d9a8-4082-bd71-51790e3e4e89-kube-api-access-4pghv\") pod \"nova-api-db-create-ttcj9\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.002964 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a20517-d9a8-4082-bd71-51790e3e4e89-operator-scripts\") pod \"nova-api-db-create-ttcj9\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.038218 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-36ba-account-create-update-vkfrb"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.040446 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.043689 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.059359 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-36ba-account-create-update-vkfrb"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.105406 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a20517-d9a8-4082-bd71-51790e3e4e89-operator-scripts\") pod \"nova-api-db-create-ttcj9\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.105520 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk27q\" (UniqueName: \"kubernetes.io/projected/47357892-adb6-48a7-99e9-b464b90d60db-kube-api-access-fk27q\") pod \"nova-cell0-db-create-k26rm\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.105592 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pghv\" (UniqueName: \"kubernetes.io/projected/a3a20517-d9a8-4082-bd71-51790e3e4e89-kube-api-access-4pghv\") pod \"nova-api-db-create-ttcj9\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.105642 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47357892-adb6-48a7-99e9-b464b90d60db-operator-scripts\") pod \"nova-cell0-db-create-k26rm\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " 
pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.106419 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a20517-d9a8-4082-bd71-51790e3e4e89-operator-scripts\") pod \"nova-api-db-create-ttcj9\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.142311 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tf769"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.143916 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.172197 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pghv\" (UniqueName: \"kubernetes.io/projected/a3a20517-d9a8-4082-bd71-51790e3e4e89-kube-api-access-4pghv\") pod \"nova-api-db-create-ttcj9\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.172316 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tf769"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.209873 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24z4\" (UniqueName: \"kubernetes.io/projected/1abecb40-40b9-4579-949a-40de63a9f65c-kube-api-access-v24z4\") pod \"nova-api-36ba-account-create-update-vkfrb\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.209954 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1abecb40-40b9-4579-949a-40de63a9f65c-operator-scripts\") pod \"nova-api-36ba-account-create-update-vkfrb\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.210008 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47357892-adb6-48a7-99e9-b464b90d60db-operator-scripts\") pod \"nova-cell0-db-create-k26rm\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.210137 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk27q\" (UniqueName: \"kubernetes.io/projected/47357892-adb6-48a7-99e9-b464b90d60db-kube-api-access-fk27q\") pod \"nova-cell0-db-create-k26rm\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.211140 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47357892-adb6-48a7-99e9-b464b90d60db-operator-scripts\") pod \"nova-cell0-db-create-k26rm\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.211204 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-fc5c-account-create-update-27xgk"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.212731 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.214964 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.228183 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fc5c-account-create-update-27xgk"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.244901 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk27q\" (UniqueName: \"kubernetes.io/projected/47357892-adb6-48a7-99e9-b464b90d60db-kube-api-access-fk27q\") pod \"nova-cell0-db-create-k26rm\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.263618 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.313609 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abecb40-40b9-4579-949a-40de63a9f65c-operator-scripts\") pod \"nova-api-36ba-account-create-update-vkfrb\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.313700 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4e49d6-e019-40a7-873b-89956a1bf3c9-operator-scripts\") pod \"nova-cell1-db-create-tf769\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.313835 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987fbded-c214-4ec4-8a67-3c79ded79782-operator-scripts\") pod \"nova-cell0-fc5c-account-create-update-27xgk\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.314086 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrx9w\" (UniqueName: \"kubernetes.io/projected/be4e49d6-e019-40a7-873b-89956a1bf3c9-kube-api-access-xrx9w\") pod \"nova-cell1-db-create-tf769\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.314187 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5w7z\" (UniqueName: \"kubernetes.io/projected/987fbded-c214-4ec4-8a67-3c79ded79782-kube-api-access-d5w7z\") pod \"nova-cell0-fc5c-account-create-update-27xgk\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.314374 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24z4\" (UniqueName: \"kubernetes.io/projected/1abecb40-40b9-4579-949a-40de63a9f65c-kube-api-access-v24z4\") pod \"nova-api-36ba-account-create-update-vkfrb\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.315872 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abecb40-40b9-4579-949a-40de63a9f65c-operator-scripts\") pod \"nova-api-36ba-account-create-update-vkfrb\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 
09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.333325 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24z4\" (UniqueName: \"kubernetes.io/projected/1abecb40-40b9-4579-949a-40de63a9f65c-kube-api-access-v24z4\") pod \"nova-api-36ba-account-create-update-vkfrb\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.362844 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.415848 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrx9w\" (UniqueName: \"kubernetes.io/projected/be4e49d6-e019-40a7-873b-89956a1bf3c9-kube-api-access-xrx9w\") pod \"nova-cell1-db-create-tf769\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.415908 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5w7z\" (UniqueName: \"kubernetes.io/projected/987fbded-c214-4ec4-8a67-3c79ded79782-kube-api-access-d5w7z\") pod \"nova-cell0-fc5c-account-create-update-27xgk\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.416014 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4e49d6-e019-40a7-873b-89956a1bf3c9-operator-scripts\") pod \"nova-cell1-db-create-tf769\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.416066 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/987fbded-c214-4ec4-8a67-3c79ded79782-operator-scripts\") pod \"nova-cell0-fc5c-account-create-update-27xgk\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.417127 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987fbded-c214-4ec4-8a67-3c79ded79782-operator-scripts\") pod \"nova-cell0-fc5c-account-create-update-27xgk\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.418163 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4e49d6-e019-40a7-873b-89956a1bf3c9-operator-scripts\") pod \"nova-cell1-db-create-tf769\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.432010 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8905-account-create-update-d7k5w"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.433230 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.440828 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8905-account-create-update-d7k5w"] Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.441381 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.444236 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5w7z\" (UniqueName: \"kubernetes.io/projected/987fbded-c214-4ec4-8a67-3c79ded79782-kube-api-access-d5w7z\") pod \"nova-cell0-fc5c-account-create-update-27xgk\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.444839 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrx9w\" (UniqueName: \"kubernetes.io/projected/be4e49d6-e019-40a7-873b-89956a1bf3c9-kube-api-access-xrx9w\") pod \"nova-cell1-db-create-tf769\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.445889 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.606840 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.620651 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5d2\" (UniqueName: \"kubernetes.io/projected/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-kube-api-access-fq5d2\") pod \"nova-cell1-8905-account-create-update-d7k5w\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.620790 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-operator-scripts\") pod \"nova-cell1-8905-account-create-update-d7k5w\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.623678 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.722567 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-operator-scripts\") pod \"nova-cell1-8905-account-create-update-d7k5w\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.722726 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5d2\" (UniqueName: \"kubernetes.io/projected/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-kube-api-access-fq5d2\") pod \"nova-cell1-8905-account-create-update-d7k5w\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.723861 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-operator-scripts\") pod \"nova-cell1-8905-account-create-update-d7k5w\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.745852 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5d2\" (UniqueName: \"kubernetes.io/projected/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-kube-api-access-fq5d2\") pod \"nova-cell1-8905-account-create-update-d7k5w\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:33 crc kubenswrapper[4935]: I1217 09:25:33.864103 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.134596 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.212027 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom\") pod \"d273eade-a4a7-4f15-92e0-058d24689c84\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.212079 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-scripts\") pod \"d273eade-a4a7-4f15-92e0-058d24689c84\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.212117 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-combined-ca-bundle\") pod \"d273eade-a4a7-4f15-92e0-058d24689c84\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.212224 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data\") pod \"d273eade-a4a7-4f15-92e0-058d24689c84\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.212244 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wng86\" (UniqueName: \"kubernetes.io/projected/d273eade-a4a7-4f15-92e0-058d24689c84-kube-api-access-wng86\") pod \"d273eade-a4a7-4f15-92e0-058d24689c84\" (UID: 
\"d273eade-a4a7-4f15-92e0-058d24689c84\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.212318 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d273eade-a4a7-4f15-92e0-058d24689c84-etc-machine-id\") pod \"d273eade-a4a7-4f15-92e0-058d24689c84\" (UID: \"d273eade-a4a7-4f15-92e0-058d24689c84\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.212707 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d273eade-a4a7-4f15-92e0-058d24689c84-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d273eade-a4a7-4f15-92e0-058d24689c84" (UID: "d273eade-a4a7-4f15-92e0-058d24689c84"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.242134 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.242181 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d273eade-a4a7-4f15-92e0-058d24689c84" (UID: "d273eade-a4a7-4f15-92e0-058d24689c84"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.247927 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-scripts" (OuterVolumeSpecName: "scripts") pod "d273eade-a4a7-4f15-92e0-058d24689c84" (UID: "d273eade-a4a7-4f15-92e0-058d24689c84"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.249896 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d273eade-a4a7-4f15-92e0-058d24689c84-kube-api-access-wng86" (OuterVolumeSpecName: "kube-api-access-wng86") pod "d273eade-a4a7-4f15-92e0-058d24689c84" (UID: "d273eade-a4a7-4f15-92e0-058d24689c84"). InnerVolumeSpecName "kube-api-access-wng86". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.315812 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-config-data\") pod \"73e821de-bc88-4508-a699-85eac867742d\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.316424 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-log-httpd\") pod \"73e821de-bc88-4508-a699-85eac867742d\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.316501 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-sg-core-conf-yaml\") pod \"73e821de-bc88-4508-a699-85eac867742d\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.316639 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-combined-ca-bundle\") pod \"73e821de-bc88-4508-a699-85eac867742d\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") " Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.316707 4935 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-run-httpd\") pod \"73e821de-bc88-4508-a699-85eac867742d\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") "
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.316781 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-scripts\") pod \"73e821de-bc88-4508-a699-85eac867742d\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") "
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.316820 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxdzv\" (UniqueName: \"kubernetes.io/projected/73e821de-bc88-4508-a699-85eac867742d-kube-api-access-sxdzv\") pod \"73e821de-bc88-4508-a699-85eac867742d\" (UID: \"73e821de-bc88-4508-a699-85eac867742d\") "
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.317501 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wng86\" (UniqueName: \"kubernetes.io/projected/d273eade-a4a7-4f15-92e0-058d24689c84-kube-api-access-wng86\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.317524 4935 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d273eade-a4a7-4f15-92e0-058d24689c84-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.317538 4935 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.317549 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-scripts\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.318198 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "73e821de-bc88-4508-a699-85eac867742d" (UID: "73e821de-bc88-4508-a699-85eac867742d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.318897 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "73e821de-bc88-4508-a699-85eac867742d" (UID: "73e821de-bc88-4508-a699-85eac867742d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.328771 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73e821de-bc88-4508-a699-85eac867742d-kube-api-access-sxdzv" (OuterVolumeSpecName: "kube-api-access-sxdzv") pod "73e821de-bc88-4508-a699-85eac867742d" (UID: "73e821de-bc88-4508-a699-85eac867742d"). InnerVolumeSpecName "kube-api-access-sxdzv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.339757 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-scripts" (OuterVolumeSpecName: "scripts") pod "73e821de-bc88-4508-a699-85eac867742d" (UID: "73e821de-bc88-4508-a699-85eac867742d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.359166 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "73e821de-bc88-4508-a699-85eac867742d" (UID: "73e821de-bc88-4508-a699-85eac867742d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.365476 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d273eade-a4a7-4f15-92e0-058d24689c84" (UID: "d273eade-a4a7-4f15-92e0-058d24689c84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.418560 4935 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.418595 4935 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.418607 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.418616 4935 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/73e821de-bc88-4508-a699-85eac867742d-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.418626 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-scripts\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.418637 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxdzv\" (UniqueName: \"kubernetes.io/projected/73e821de-bc88-4508-a699-85eac867742d-kube-api-access-sxdzv\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.435338 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data" (OuterVolumeSpecName: "config-data") pod "d273eade-a4a7-4f15-92e0-058d24689c84" (UID: "d273eade-a4a7-4f15-92e0-058d24689c84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.456314 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-config-data" (OuterVolumeSpecName: "config-data") pod "73e821de-bc88-4508-a699-85eac867742d" (UID: "73e821de-bc88-4508-a699-85eac867742d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.457419 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73e821de-bc88-4508-a699-85eac867742d" (UID: "73e821de-bc88-4508-a699-85eac867742d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.520452 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d273eade-a4a7-4f15-92e0-058d24689c84-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.520516 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.520529 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73e821de-bc88-4508-a699-85eac867742d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.610729 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ttcj9"]
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.685239 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k26rm"]
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.703700 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-fc5c-account-create-update-27xgk"]
Dec 17 09:25:37 crc kubenswrapper[4935]: W1217 09:25:37.707880 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod987fbded_c214_4ec4_8a67_3c79ded79782.slice/crio-9bb8a9b653e913cd30fde60f13b66d449a13cbe0446a2e1c95c421fbfc12a6da WatchSource:0}: Error finding container 9bb8a9b653e913cd30fde60f13b66d449a13cbe0446a2e1c95c421fbfc12a6da: Status 404 returned error can't find the container with id 9bb8a9b653e913cd30fde60f13b66d449a13cbe0446a2e1c95c421fbfc12a6da
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.735382 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tf769"]
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.798240 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-767f646f97-whvbb"]
Dec 17 09:25:37 crc kubenswrapper[4935]: W1217 09:25:37.910715 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1abecb40_40b9_4579_949a_40de63a9f65c.slice/crio-1e56028e0522783a01cc8445defd56cdca58d7908589fc666a23cdd8321275ae WatchSource:0}: Error finding container 1e56028e0522783a01cc8445defd56cdca58d7908589fc666a23cdd8321275ae: Status 404 returned error can't find the container with id 1e56028e0522783a01cc8445defd56cdca58d7908589fc666a23cdd8321275ae
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.910964 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-36ba-account-create-update-vkfrb"]
Dec 17 09:25:37 crc kubenswrapper[4935]: I1217 09:25:37.931660 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8905-account-create-update-d7k5w"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.038433 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-36ba-account-create-update-vkfrb" event={"ID":"1abecb40-40b9-4579-949a-40de63a9f65c","Type":"ContainerStarted","Data":"1e56028e0522783a01cc8445defd56cdca58d7908589fc666a23cdd8321275ae"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.044168 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tf769" event={"ID":"be4e49d6-e019-40a7-873b-89956a1bf3c9","Type":"ContainerStarted","Data":"404547c63c5c04a33f1f53b7717f4afebbf5e1ce68a8925746f4bf1f7e1264e1"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.055884 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" event={"ID":"987fbded-c214-4ec4-8a67-3c79ded79782","Type":"ContainerStarted","Data":"9bb8a9b653e913cd30fde60f13b66d449a13cbe0446a2e1c95c421fbfc12a6da"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.059056 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k26rm" event={"ID":"47357892-adb6-48a7-99e9-b464b90d60db","Type":"ContainerStarted","Data":"ebdf63b78ba6084baa0109d5b8715b520a6c23c1a40cd74091e3ee52b8547571"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.062206 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0","Type":"ContainerStarted","Data":"435bb246ffeeb8f84bd74276dda459528f08c0ce8d6c6fe317d0a03617c0c437"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.074901 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ttcj9" event={"ID":"a3a20517-d9a8-4082-bd71-51790e3e4e89","Type":"ContainerStarted","Data":"dd4dfe6f58e97542bc6a1d0c20e379cc5ce81d4d175be957003516b8b29d3c17"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.074963 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ttcj9" event={"ID":"a3a20517-d9a8-4082-bd71-51790e3e4e89","Type":"ContainerStarted","Data":"b6e0fb3863345d701e85e315573071fc4b2f33b4c6323efe1ef75f9aafc77e59"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.082574 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8905-account-create-update-d7k5w" event={"ID":"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3","Type":"ContainerStarted","Data":"a4458388cfd16c9de3d95de08ce6b49b5cc99e20186f1011ba85866c5518f1a7"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.094266 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-k26rm" podStartSLOduration=6.094240787 podStartE2EDuration="6.094240787s" podCreationTimestamp="2025-12-17 09:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:38.089075601 +0000 UTC m=+1257.748916364" watchObservedRunningTime="2025-12-17 09:25:38.094240787 +0000 UTC m=+1257.754081550"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.100471 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"d273eade-a4a7-4f15-92e0-058d24689c84","Type":"ContainerDied","Data":"fe1af8815ce704333068574d6d6dbe8081db9c43846a76a1c757fc504d6c4821"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.100560 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.100566 4935 scope.go:117] "RemoveContainer" containerID="fd8e78f181102a463c35b87cad0742d06048e38d550bb0c211bfdb0079a94551"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.106373 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"73e821de-bc88-4508-a699-85eac867742d","Type":"ContainerDied","Data":"98c8c9fcaa55332ef3fe88d95bfc86074244f59f80b3d36c8ca107bcec4b14f9"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.106526 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.134154 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-ttcj9" podStartSLOduration=6.134121821 podStartE2EDuration="6.134121821s" podCreationTimestamp="2025-12-17 09:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:38.106395734 +0000 UTC m=+1257.766236497" watchObservedRunningTime="2025-12-17 09:25:38.134121821 +0000 UTC m=+1257.793962584"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.155308 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-767f646f97-whvbb" event={"ID":"562d41ed-7767-48a4-9cf4-84405e6deb48","Type":"ContainerStarted","Data":"f5e07448a404f80702b2832f76f945aea669b7656c5673eb96d2cc695da61f8a"}
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.201431 4935 scope.go:117] "RemoveContainer" containerID="8a03a966bd0672b4ed72604f5ea9e0ce6fa754b1b9e44becfc231ad8d3f9581b"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.209497 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.752563231 podStartE2EDuration="17.209470731s" podCreationTimestamp="2025-12-17 09:25:21 +0000 UTC" firstStartedPulling="2025-12-17 09:25:22.387915435 +0000 UTC m=+1242.047756198" lastFinishedPulling="2025-12-17 09:25:36.844822935 +0000 UTC m=+1256.504663698" observedRunningTime="2025-12-17 09:25:38.138487058 +0000 UTC m=+1257.798327821" watchObservedRunningTime="2025-12-17 09:25:38.209470731 +0000 UTC m=+1257.869311494"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.249188 4935 scope.go:117] "RemoveContainer" containerID="8a18f97061cd8c12313bc57a4887bfddf513e7a684a4420f301333b179398bd1"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.249906 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.257784 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.272488 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: E1217 09:25:38.272955 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-notification-agent"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.272973 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-notification-agent"
Dec 17 09:25:38 crc kubenswrapper[4935]: E1217 09:25:38.272995 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="proxy-httpd"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.273003 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="proxy-httpd"
Dec 17 09:25:38 crc kubenswrapper[4935]: E1217 09:25:38.273017 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="probe"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.273022 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="probe"
Dec 17 09:25:38 crc kubenswrapper[4935]: E1217 09:25:38.273037 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="cinder-scheduler"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.273044 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="cinder-scheduler"
Dec 17 09:25:38 crc kubenswrapper[4935]: E1217 09:25:38.273057 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="sg-core"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.273062 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="sg-core"
Dec 17 09:25:38 crc kubenswrapper[4935]: E1217 09:25:38.273077 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-central-agent"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.273085 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-central-agent"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.273252 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-central-agent"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.273263 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="ceilometer-notification-agent"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.276717 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="sg-core"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.276752 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="cinder-scheduler"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.276792 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" containerName="probe"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.276802 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="73e821de-bc88-4508-a699-85eac867742d" containerName="proxy-httpd"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.278169 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.281413 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.311980 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.315745 4935 scope.go:117] "RemoveContainer" containerID="0acc78042528e26326a44346f2403ef147dcf641bfd0b5f2339cb6cc13083dd6"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.339918 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.363016 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.367421 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.367498 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fb81e79-940d-4ba5-a10d-c22dca5377e0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.367516 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-config-data\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.367548 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-scripts\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.367577 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.367636 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8m4hg\" (UniqueName: \"kubernetes.io/projected/7fb81e79-940d-4ba5-a10d-c22dca5377e0-kube-api-access-8m4hg\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.371161 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.374818 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.381758 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.382688 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.382724 4935 scope.go:117] "RemoveContainer" containerID="9715846b579f96180a870d2250f78e9a922a73afa7e198eb107b10e87e0e5317"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.382818 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469198 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-log-httpd\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469317 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8m4hg\" (UniqueName: \"kubernetes.io/projected/7fb81e79-940d-4ba5-a10d-c22dca5377e0-kube-api-access-8m4hg\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469348 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-scripts\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469453 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469508 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtzgg\" (UniqueName: \"kubernetes.io/projected/fc3bdffb-fb62-44b3-85a2-50126283d365-kube-api-access-vtzgg\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469531 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469587 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fb81e79-940d-4ba5-a10d-c22dca5377e0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469608 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-config-data\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469656 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469681 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-scripts\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469703 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-run-httpd\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469756 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-config-data\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.469786 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.472361 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7fb81e79-940d-4ba5-a10d-c22dca5377e0-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.475417 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.476942 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.479816 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-scripts\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.485864 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7fb81e79-940d-4ba5-a10d-c22dca5377e0-config-data\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.490915 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8m4hg\" (UniqueName: \"kubernetes.io/projected/7fb81e79-940d-4ba5-a10d-c22dca5377e0-kube-api-access-8m4hg\") pod \"cinder-scheduler-0\" (UID: \"7fb81e79-940d-4ba5-a10d-c22dca5377e0\") " pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.571372 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.571432 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-run-httpd\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.571477 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-config-data\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.571568 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-log-httpd\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.571596 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-scripts\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.571654 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.571692 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtzgg\" (UniqueName: \"kubernetes.io/projected/fc3bdffb-fb62-44b3-85a2-50126283d365-kube-api-access-vtzgg\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.572856 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-log-httpd\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.574323 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-run-httpd\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.577220 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-config-data\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.580183 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.581565 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-scripts\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.589142 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.594255 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtzgg\" (UniqueName: \"kubernetes.io/projected/fc3bdffb-fb62-44b3-85a2-50126283d365-kube-api-access-vtzgg\") pod \"ceilometer-0\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") " pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.627850 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.902470 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 17 09:25:38 crc kubenswrapper[4935]: I1217 09:25:38.910212 4935 scope.go:117] "RemoveContainer" containerID="679a1bc116b0cee3e8c4d47f888a549e89f4f4f4fad87e14fd583b341f37ce12"
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.063212 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54c44548bb-26j2l" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.138358 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73e821de-bc88-4508-a699-85eac867742d" path="/var/lib/kubelet/pods/73e821de-bc88-4508-a699-85eac867742d/volumes"
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.139846 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d273eade-a4a7-4f15-92e0-058d24689c84" path="/var/lib/kubelet/pods/d273eade-a4a7-4f15-92e0-058d24689c84/volumes"
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.184543 4935 generic.go:334] "Generic (PLEG): container finished" podID="1abecb40-40b9-4579-949a-40de63a9f65c" containerID="3b465b3f36b90a6aabc5fa090669a5cd9addbc15694b0ee9bb50dbc35ae9cd5d" exitCode=0
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.184660 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-36ba-account-create-update-vkfrb" event={"ID":"1abecb40-40b9-4579-949a-40de63a9f65c","Type":"ContainerDied","Data":"3b465b3f36b90a6aabc5fa090669a5cd9addbc15694b0ee9bb50dbc35ae9cd5d"}
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.187847 4935 generic.go:334] "Generic (PLEG): container finished" podID="be4e49d6-e019-40a7-873b-89956a1bf3c9" containerID="baaf176e20cd084e202c11bb76c0a427e4171584299baafeb86a1f1bc4570c23" exitCode=0
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.187896 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tf769" event={"ID":"be4e49d6-e019-40a7-873b-89956a1bf3c9","Type":"ContainerDied","Data":"baaf176e20cd084e202c11bb76c0a427e4171584299baafeb86a1f1bc4570c23"}
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.192296 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.200468 4935 generic.go:334] "Generic (PLEG): container finished" podID="987fbded-c214-4ec4-8a67-3c79ded79782" containerID="df23db94be5d90dbc57efa70742d40f9309536de3051e7e27800233e6f978215" exitCode=0
Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.200555 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" event={"ID":"987fbded-c214-4ec4-8a67-3c79ded79782","Type":"ContainerDied","Data":"df23db94be5d90dbc57efa70742d40f9309536de3051e7e27800233e6f978215"}
Dec 17 09:25:39 crc kubenswrapper[4935]: 
I1217 09:25:39.210261 4935 generic.go:334] "Generic (PLEG): container finished" podID="a3a20517-d9a8-4082-bd71-51790e3e4e89" containerID="dd4dfe6f58e97542bc6a1d0c20e379cc5ce81d4d175be957003516b8b29d3c17" exitCode=0 Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.210357 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ttcj9" event={"ID":"a3a20517-d9a8-4082-bd71-51790e3e4e89","Type":"ContainerDied","Data":"dd4dfe6f58e97542bc6a1d0c20e379cc5ce81d4d175be957003516b8b29d3c17"} Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.218146 4935 generic.go:334] "Generic (PLEG): container finished" podID="47357892-adb6-48a7-99e9-b464b90d60db" containerID="429cba53824dc60789d46872c8f1501dab831ba522d949c4155b2d8dfb4293dc" exitCode=0 Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.218255 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k26rm" event={"ID":"47357892-adb6-48a7-99e9-b464b90d60db","Type":"ContainerDied","Data":"429cba53824dc60789d46872c8f1501dab831ba522d949c4155b2d8dfb4293dc"} Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.220620 4935 generic.go:334] "Generic (PLEG): container finished" podID="7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3" containerID="6595c21fc067628d82349f91738809e6a18571ba8dd3b2825dc2e2abff0cadf6" exitCode=0 Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.220668 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8905-account-create-update-d7k5w" event={"ID":"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3","Type":"ContainerDied","Data":"6595c21fc067628d82349f91738809e6a18571ba8dd3b2825dc2e2abff0cadf6"} Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.231156 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-767f646f97-whvbb" event={"ID":"562d41ed-7767-48a4-9cf4-84405e6deb48","Type":"ContainerStarted","Data":"d25652d4e05f53534dea63cff02f66b8a8babb2393bee0812c776d7d5f254765"} Dec 17 
09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.231199 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.231210 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-767f646f97-whvbb" event={"ID":"562d41ed-7767-48a4-9cf4-84405e6deb48","Type":"ContainerStarted","Data":"d27ef21bdc239f97987ca9d330301bad6efaa6f23d7239dc2e1af5e8da445bd3"} Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.231239 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-767f646f97-whvbb" Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.343965 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-767f646f97-whvbb" podStartSLOduration=11.343940736 podStartE2EDuration="11.343940736s" podCreationTimestamp="2025-12-17 09:25:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:39.314786575 +0000 UTC m=+1258.974627358" watchObservedRunningTime="2025-12-17 09:25:39.343940736 +0000 UTC m=+1259.003781499" Dec 17 09:25:39 crc kubenswrapper[4935]: I1217 09:25:39.498839 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:39 crc kubenswrapper[4935]: E1217 09:25:39.784657 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8cca130e_5dbc_4edb_b0d9_f04a0ecd2ea6.slice/crio-conmon-60a63cdd5db0c8a1a4f78de46ef7939045b1c4415629aba4105851cdee5b0b57.scope\": RecentStats: unable to find data in memory cache]" Dec 17 09:25:40 crc kubenswrapper[4935]: I1217 09:25:40.318312 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerStarted","Data":"c4d39c2191b296902e37744dd50e4914b275e5461ba12e2515baaa20cd4d4f78"} Dec 17 09:25:40 crc kubenswrapper[4935]: I1217 09:25:40.338740 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7fb81e79-940d-4ba5-a10d-c22dca5377e0","Type":"ContainerStarted","Data":"759d4b6ae3b84fd67610088a07b4148f9a384973ae39c03520dc666023364e90"} Dec 17 09:25:40 crc kubenswrapper[4935]: I1217 09:25:40.338793 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7fb81e79-940d-4ba5-a10d-c22dca5377e0","Type":"ContainerStarted","Data":"e86406eb2657bd2ba1c1603c30d2be3884675ba7933b9d1232f88b61154c1d65"} Dec 17 09:25:40 crc kubenswrapper[4935]: I1217 09:25:40.956857 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.149183 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-operator-scripts\") pod \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.149516 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq5d2\" (UniqueName: \"kubernetes.io/projected/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-kube-api-access-fq5d2\") pod \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\" (UID: \"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.152559 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3" 
(UID: "7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.161413 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-kube-api-access-fq5d2" (OuterVolumeSpecName: "kube-api-access-fq5d2") pod "7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3" (UID: "7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3"). InnerVolumeSpecName "kube-api-access-fq5d2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.253950 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq5d2\" (UniqueName: \"kubernetes.io/projected/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-kube-api-access-fq5d2\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.253988 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.352841 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerStarted","Data":"60f2443467ea79cf6cd31ace0b88a01c836398f31a831f443ebf60ee013be905"} Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.355258 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8905-account-create-update-d7k5w" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.355250 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8905-account-create-update-d7k5w" event={"ID":"7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3","Type":"ContainerDied","Data":"a4458388cfd16c9de3d95de08ce6b49b5cc99e20186f1011ba85866c5518f1a7"} Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.355375 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4458388cfd16c9de3d95de08ce6b49b5cc99e20186f1011ba85866c5518f1a7" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.557684 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.611362 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.625568 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.639659 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.659648 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.662071 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5w7z\" (UniqueName: \"kubernetes.io/projected/987fbded-c214-4ec4-8a67-3c79ded79782-kube-api-access-d5w7z\") pod \"987fbded-c214-4ec4-8a67-3c79ded79782\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.662125 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987fbded-c214-4ec4-8a67-3c79ded79782-operator-scripts\") pod \"987fbded-c214-4ec4-8a67-3c79ded79782\" (UID: \"987fbded-c214-4ec4-8a67-3c79ded79782\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.662906 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/987fbded-c214-4ec4-8a67-3c79ded79782-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "987fbded-c214-4ec4-8a67-3c79ded79782" (UID: "987fbded-c214-4ec4-8a67-3c79ded79782"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.666504 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/987fbded-c214-4ec4-8a67-3c79ded79782-kube-api-access-d5w7z" (OuterVolumeSpecName: "kube-api-access-d5w7z") pod "987fbded-c214-4ec4-8a67-3c79ded79782" (UID: "987fbded-c214-4ec4-8a67-3c79ded79782"). InnerVolumeSpecName "kube-api-access-d5w7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.763932 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a20517-d9a8-4082-bd71-51790e3e4e89-operator-scripts\") pod \"a3a20517-d9a8-4082-bd71-51790e3e4e89\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764051 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47357892-adb6-48a7-99e9-b464b90d60db-operator-scripts\") pod \"47357892-adb6-48a7-99e9-b464b90d60db\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764248 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk27q\" (UniqueName: \"kubernetes.io/projected/47357892-adb6-48a7-99e9-b464b90d60db-kube-api-access-fk27q\") pod \"47357892-adb6-48a7-99e9-b464b90d60db\" (UID: \"47357892-adb6-48a7-99e9-b464b90d60db\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764344 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrx9w\" (UniqueName: \"kubernetes.io/projected/be4e49d6-e019-40a7-873b-89956a1bf3c9-kube-api-access-xrx9w\") pod \"be4e49d6-e019-40a7-873b-89956a1bf3c9\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764400 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abecb40-40b9-4579-949a-40de63a9f65c-operator-scripts\") pod \"1abecb40-40b9-4579-949a-40de63a9f65c\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764461 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-v24z4\" (UniqueName: \"kubernetes.io/projected/1abecb40-40b9-4579-949a-40de63a9f65c-kube-api-access-v24z4\") pod \"1abecb40-40b9-4579-949a-40de63a9f65c\" (UID: \"1abecb40-40b9-4579-949a-40de63a9f65c\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764497 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pghv\" (UniqueName: \"kubernetes.io/projected/a3a20517-d9a8-4082-bd71-51790e3e4e89-kube-api-access-4pghv\") pod \"a3a20517-d9a8-4082-bd71-51790e3e4e89\" (UID: \"a3a20517-d9a8-4082-bd71-51790e3e4e89\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764603 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4e49d6-e019-40a7-873b-89956a1bf3c9-operator-scripts\") pod \"be4e49d6-e019-40a7-873b-89956a1bf3c9\" (UID: \"be4e49d6-e019-40a7-873b-89956a1bf3c9\") " Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.765101 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5w7z\" (UniqueName: \"kubernetes.io/projected/987fbded-c214-4ec4-8a67-3c79ded79782-kube-api-access-d5w7z\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.765120 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/987fbded-c214-4ec4-8a67-3c79ded79782-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764528 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a20517-d9a8-4082-bd71-51790e3e4e89-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3a20517-d9a8-4082-bd71-51790e3e4e89" (UID: "a3a20517-d9a8-4082-bd71-51790e3e4e89"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.764612 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47357892-adb6-48a7-99e9-b464b90d60db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47357892-adb6-48a7-99e9-b464b90d60db" (UID: "47357892-adb6-48a7-99e9-b464b90d60db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.765590 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4e49d6-e019-40a7-873b-89956a1bf3c9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be4e49d6-e019-40a7-873b-89956a1bf3c9" (UID: "be4e49d6-e019-40a7-873b-89956a1bf3c9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.766476 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1abecb40-40b9-4579-949a-40de63a9f65c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1abecb40-40b9-4579-949a-40de63a9f65c" (UID: "1abecb40-40b9-4579-949a-40de63a9f65c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.768293 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47357892-adb6-48a7-99e9-b464b90d60db-kube-api-access-fk27q" (OuterVolumeSpecName: "kube-api-access-fk27q") pod "47357892-adb6-48a7-99e9-b464b90d60db" (UID: "47357892-adb6-48a7-99e9-b464b90d60db"). InnerVolumeSpecName "kube-api-access-fk27q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.770089 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4e49d6-e019-40a7-873b-89956a1bf3c9-kube-api-access-xrx9w" (OuterVolumeSpecName: "kube-api-access-xrx9w") pod "be4e49d6-e019-40a7-873b-89956a1bf3c9" (UID: "be4e49d6-e019-40a7-873b-89956a1bf3c9"). InnerVolumeSpecName "kube-api-access-xrx9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.770907 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a20517-d9a8-4082-bd71-51790e3e4e89-kube-api-access-4pghv" (OuterVolumeSpecName: "kube-api-access-4pghv") pod "a3a20517-d9a8-4082-bd71-51790e3e4e89" (UID: "a3a20517-d9a8-4082-bd71-51790e3e4e89"). InnerVolumeSpecName "kube-api-access-4pghv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.772736 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abecb40-40b9-4579-949a-40de63a9f65c-kube-api-access-v24z4" (OuterVolumeSpecName: "kube-api-access-v24z4") pod "1abecb40-40b9-4579-949a-40de63a9f65c" (UID: "1abecb40-40b9-4579-949a-40de63a9f65c"). InnerVolumeSpecName "kube-api-access-v24z4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.866922 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be4e49d6-e019-40a7-873b-89956a1bf3c9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.866973 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3a20517-d9a8-4082-bd71-51790e3e4e89-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.866983 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47357892-adb6-48a7-99e9-b464b90d60db-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.866992 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk27q\" (UniqueName: \"kubernetes.io/projected/47357892-adb6-48a7-99e9-b464b90d60db-kube-api-access-fk27q\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.868993 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrx9w\" (UniqueName: \"kubernetes.io/projected/be4e49d6-e019-40a7-873b-89956a1bf3c9-kube-api-access-xrx9w\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.869017 4935 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1abecb40-40b9-4579-949a-40de63a9f65c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.869028 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v24z4\" (UniqueName: \"kubernetes.io/projected/1abecb40-40b9-4579-949a-40de63a9f65c-kube-api-access-v24z4\") on node \"crc\" DevicePath \"\"" Dec 17 
09:25:41 crc kubenswrapper[4935]: I1217 09:25:41.869036 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pghv\" (UniqueName: \"kubernetes.io/projected/a3a20517-d9a8-4082-bd71-51790e3e4e89-kube-api-access-4pghv\") on node \"crc\" DevicePath \"\"" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.369805 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ttcj9" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.369944 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ttcj9" event={"ID":"a3a20517-d9a8-4082-bd71-51790e3e4e89","Type":"ContainerDied","Data":"b6e0fb3863345d701e85e315573071fc4b2f33b4c6323efe1ef75f9aafc77e59"} Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.370259 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6e0fb3863345d701e85e315573071fc4b2f33b4c6323efe1ef75f9aafc77e59" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.375493 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" event={"ID":"987fbded-c214-4ec4-8a67-3c79ded79782","Type":"ContainerDied","Data":"9bb8a9b653e913cd30fde60f13b66d449a13cbe0446a2e1c95c421fbfc12a6da"} Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.375569 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bb8a9b653e913cd30fde60f13b66d449a13cbe0446a2e1c95c421fbfc12a6da" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.375703 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-fc5c-account-create-update-27xgk" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.384306 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-36ba-account-create-update-vkfrb" event={"ID":"1abecb40-40b9-4579-949a-40de63a9f65c","Type":"ContainerDied","Data":"1e56028e0522783a01cc8445defd56cdca58d7908589fc666a23cdd8321275ae"} Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.384367 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e56028e0522783a01cc8445defd56cdca58d7908589fc666a23cdd8321275ae" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.384322 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-36ba-account-create-update-vkfrb" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.395134 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k26rm" event={"ID":"47357892-adb6-48a7-99e9-b464b90d60db","Type":"ContainerDied","Data":"ebdf63b78ba6084baa0109d5b8715b520a6c23c1a40cd74091e3ee52b8547571"} Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.395185 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebdf63b78ba6084baa0109d5b8715b520a6c23c1a40cd74091e3ee52b8547571" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.395617 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k26rm" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.408844 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerStarted","Data":"1a4df0611eeb910f8bd70c20df59b78508108ad5f7d8bdc34c861d01611c78d9"} Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.427978 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7fb81e79-940d-4ba5-a10d-c22dca5377e0","Type":"ContainerStarted","Data":"b2e77e69f7feef3ae68ffca2199f94be5b34831cf08296a98372989fc935ec9a"} Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.440294 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tf769" event={"ID":"be4e49d6-e019-40a7-873b-89956a1bf3c9","Type":"ContainerDied","Data":"404547c63c5c04a33f1f53b7717f4afebbf5e1ce68a8925746f4bf1f7e1264e1"} Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.440341 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="404547c63c5c04a33f1f53b7717f4afebbf5e1ce68a8925746f4bf1f7e1264e1" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.440432 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tf769" Dec 17 09:25:42 crc kubenswrapper[4935]: I1217 09:25:42.461734 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.461707617 podStartE2EDuration="4.461707617s" podCreationTimestamp="2025-12-17 09:25:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:25:42.454609894 +0000 UTC m=+1262.114450657" watchObservedRunningTime="2025-12-17 09:25:42.461707617 +0000 UTC m=+1262.121548390" Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.470028 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerStarted","Data":"bb3a316c546d57dfa590b5287064fe9b5334c438aab9482bd0aa04c4fe7ad38c"} Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.624405 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hk94m"] Dec 17 09:25:43 crc kubenswrapper[4935]: E1217 09:25:43.624870 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3" containerName="mariadb-account-create-update" Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.624888 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3" containerName="mariadb-account-create-update" Dec 17 09:25:43 crc kubenswrapper[4935]: E1217 09:25:43.624898 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abecb40-40b9-4579-949a-40de63a9f65c" containerName="mariadb-account-create-update" Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.624906 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abecb40-40b9-4579-949a-40de63a9f65c" containerName="mariadb-account-create-update" Dec 17 09:25:43 crc kubenswrapper[4935]: E1217 
09:25:43.624930 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4e49d6-e019-40a7-873b-89956a1bf3c9" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.624938 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4e49d6-e019-40a7-873b-89956a1bf3c9" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: E1217 09:25:43.624955 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="987fbded-c214-4ec4-8a67-3c79ded79782" containerName="mariadb-account-create-update"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.624967 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="987fbded-c214-4ec4-8a67-3c79ded79782" containerName="mariadb-account-create-update"
Dec 17 09:25:43 crc kubenswrapper[4935]: E1217 09:25:43.624998 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a20517-d9a8-4082-bd71-51790e3e4e89" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.625006 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a20517-d9a8-4082-bd71-51790e3e4e89" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: E1217 09:25:43.625021 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47357892-adb6-48a7-99e9-b464b90d60db" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.625028 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="47357892-adb6-48a7-99e9-b464b90d60db" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.625213 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4e49d6-e019-40a7-873b-89956a1bf3c9" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.625232 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abecb40-40b9-4579-949a-40de63a9f65c" containerName="mariadb-account-create-update"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.625245 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="47357892-adb6-48a7-99e9-b464b90d60db" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.625258 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="987fbded-c214-4ec4-8a67-3c79ded79782" containerName="mariadb-account-create-update"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.625265 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3" containerName="mariadb-account-create-update"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.627045 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a20517-d9a8-4082-bd71-51790e3e4e89" containerName="mariadb-database-create"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.627773 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.628944 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.631199 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q2l67"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.631566 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.633068 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.640608 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hk94m"]
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.722443 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw99p\" (UniqueName: \"kubernetes.io/projected/e9add23c-5f09-479c-94f8-4e4c35af6dde-kube-api-access-tw99p\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.724459 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.724705 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-config-data\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.724838 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-scripts\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.826503 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.826597 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-config-data\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.826650 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-scripts\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.826696 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw99p\" (UniqueName: \"kubernetes.io/projected/e9add23c-5f09-479c-94f8-4e4c35af6dde-kube-api-access-tw99p\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.834782 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.839370 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-config-data\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.842726 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-scripts\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.853976 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw99p\" (UniqueName: \"kubernetes.io/projected/e9add23c-5f09-479c-94f8-4e4c35af6dde-kube-api-access-tw99p\") pod \"nova-cell0-conductor-db-sync-hk94m\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:43 crc kubenswrapper[4935]: I1217 09:25:43.948011 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hk94m"
Dec 17 09:25:44 crc kubenswrapper[4935]: I1217 09:25:44.326073 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-767f646f97-whvbb"
Dec 17 09:25:44 crc kubenswrapper[4935]: I1217 09:25:44.326643 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-767f646f97-whvbb"
Dec 17 09:25:44 crc kubenswrapper[4935]: I1217 09:25:44.488769 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hk94m"]
Dec 17 09:25:45 crc kubenswrapper[4935]: I1217 09:25:45.510738 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerStarted","Data":"ee366bc255d7d29630086b2d3e47320f0a5c5fe5b8f7f2e57a884327253bec87"}
Dec 17 09:25:45 crc kubenswrapper[4935]: I1217 09:25:45.512541 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 17 09:25:45 crc kubenswrapper[4935]: I1217 09:25:45.513959 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hk94m" event={"ID":"e9add23c-5f09-479c-94f8-4e4c35af6dde","Type":"ContainerStarted","Data":"a669e9e7b49a1f654769027cbdd7934275ebc0017e57b221d34e6eb8eab5b4dd"}
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.447737 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54c44548bb-26j2l"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.481991 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-secret-key\") pod \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") "
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.482067 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-scripts\") pod \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") "
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.482136 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5385d045-3f7c-447d-8ce8-d12a8de0cdce-logs\") pod \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") "
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.482266 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-config-data\") pod \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") "
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.482346 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-tls-certs\") pod \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") "
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.482402 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tf48\" (UniqueName: \"kubernetes.io/projected/5385d045-3f7c-447d-8ce8-d12a8de0cdce-kube-api-access-4tf48\") pod \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") "
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.482532 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-combined-ca-bundle\") pod \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\" (UID: \"5385d045-3f7c-447d-8ce8-d12a8de0cdce\") "
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.483675 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5385d045-3f7c-447d-8ce8-d12a8de0cdce-logs" (OuterVolumeSpecName: "logs") pod "5385d045-3f7c-447d-8ce8-d12a8de0cdce" (UID: "5385d045-3f7c-447d-8ce8-d12a8de0cdce"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.483669 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.583647886 podStartE2EDuration="8.48364663s" podCreationTimestamp="2025-12-17 09:25:38 +0000 UTC" firstStartedPulling="2025-12-17 09:25:39.523472542 +0000 UTC m=+1259.183313305" lastFinishedPulling="2025-12-17 09:25:44.423471276 +0000 UTC m=+1264.083312049" observedRunningTime="2025-12-17 09:25:45.550551471 +0000 UTC m=+1265.210392234" watchObservedRunningTime="2025-12-17 09:25:46.48364663 +0000 UTC m=+1266.143487393"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.522297 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5385d045-3f7c-447d-8ce8-d12a8de0cdce-kube-api-access-4tf48" (OuterVolumeSpecName: "kube-api-access-4tf48") pod "5385d045-3f7c-447d-8ce8-d12a8de0cdce" (UID: "5385d045-3f7c-447d-8ce8-d12a8de0cdce"). InnerVolumeSpecName "kube-api-access-4tf48". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.524930 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-scripts" (OuterVolumeSpecName: "scripts") pod "5385d045-3f7c-447d-8ce8-d12a8de0cdce" (UID: "5385d045-3f7c-447d-8ce8-d12a8de0cdce"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.530557 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5385d045-3f7c-447d-8ce8-d12a8de0cdce" (UID: "5385d045-3f7c-447d-8ce8-d12a8de0cdce"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.539646 4935 generic.go:334] "Generic (PLEG): container finished" podID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerID="26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7" exitCode=137
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.541864 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54c44548bb-26j2l"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.542431 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c44548bb-26j2l" event={"ID":"5385d045-3f7c-447d-8ce8-d12a8de0cdce","Type":"ContainerDied","Data":"26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7"}
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.542464 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54c44548bb-26j2l" event={"ID":"5385d045-3f7c-447d-8ce8-d12a8de0cdce","Type":"ContainerDied","Data":"b74215db1bd1dd575fe06e49b4959bde1cde54925c011c89f26e1453d3916289"}
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.542482 4935 scope.go:117] "RemoveContainer" containerID="0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.561387 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5385d045-3f7c-447d-8ce8-d12a8de0cdce" (UID: "5385d045-3f7c-447d-8ce8-d12a8de0cdce"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.563366 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-config-data" (OuterVolumeSpecName: "config-data") pod "5385d045-3f7c-447d-8ce8-d12a8de0cdce" (UID: "5385d045-3f7c-447d-8ce8-d12a8de0cdce"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.572892 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5385d045-3f7c-447d-8ce8-d12a8de0cdce" (UID: "5385d045-3f7c-447d-8ce8-d12a8de0cdce"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.584942 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.585216 4935 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.585316 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-scripts\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.585391 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5385d045-3f7c-447d-8ce8-d12a8de0cdce-logs\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.585447 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5385d045-3f7c-447d-8ce8-d12a8de0cdce-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.585498 4935 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5385d045-3f7c-447d-8ce8-d12a8de0cdce-horizon-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.585560 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tf48\" (UniqueName: \"kubernetes.io/projected/5385d045-3f7c-447d-8ce8-d12a8de0cdce-kube-api-access-4tf48\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.845051 4935 scope.go:117] "RemoveContainer" containerID="26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.872610 4935 scope.go:117] "RemoveContainer" containerID="0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1"
Dec 17 09:25:46 crc kubenswrapper[4935]: E1217 09:25:46.877709 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1\": container with ID starting with 0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1 not found: ID does not exist" containerID="0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.877790 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1"} err="failed to get container status \"0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1\": rpc error: code = NotFound desc = could not find container \"0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1\": container with ID starting with 0dddfaec4e968597a481995e27c2a57f3d0b52b12c1f61d20955d8447d84c8f1 not found: ID does not exist"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.877850 4935 scope.go:117] "RemoveContainer" containerID="26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.880080 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54c44548bb-26j2l"]
Dec 17 09:25:46 crc kubenswrapper[4935]: E1217 09:25:46.880411 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7\": container with ID starting with 26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7 not found: ID does not exist" containerID="26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.880551 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7"} err="failed to get container status \"26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7\": rpc error: code = NotFound desc = could not find container \"26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7\": container with ID starting with 26a443b6c6002f41051ab1d32cc2f8ef7e29d96b6fee878fd2ea7a2ed4b58aa7 not found: ID does not exist"
Dec 17 09:25:46 crc kubenswrapper[4935]: I1217 09:25:46.897264 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54c44548bb-26j2l"]
Dec 17 09:25:47 crc kubenswrapper[4935]: I1217 09:25:47.150768 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" path="/var/lib/kubelet/pods/5385d045-3f7c-447d-8ce8-d12a8de0cdce/volumes"
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.069477 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.069777 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-central-agent" containerID="cri-o://60f2443467ea79cf6cd31ace0b88a01c836398f31a831f443ebf60ee013be905" gracePeriod=30
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.069937 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="proxy-httpd" containerID="cri-o://ee366bc255d7d29630086b2d3e47320f0a5c5fe5b8f7f2e57a884327253bec87" gracePeriod=30
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.069977 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="sg-core" containerID="cri-o://bb3a316c546d57dfa590b5287064fe9b5334c438aab9482bd0aa04c4fe7ad38c" gracePeriod=30
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.070010 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-notification-agent" containerID="cri-o://1a4df0611eeb910f8bd70c20df59b78508108ad5f7d8bdc34c861d01611c78d9" gracePeriod=30
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570636 4935 generic.go:334] "Generic (PLEG): container finished" podID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerID="ee366bc255d7d29630086b2d3e47320f0a5c5fe5b8f7f2e57a884327253bec87" exitCode=0
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570687 4935 generic.go:334] "Generic (PLEG): container finished" podID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerID="bb3a316c546d57dfa590b5287064fe9b5334c438aab9482bd0aa04c4fe7ad38c" exitCode=2
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570697 4935 generic.go:334] "Generic (PLEG): container finished" podID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerID="1a4df0611eeb910f8bd70c20df59b78508108ad5f7d8bdc34c861d01611c78d9" exitCode=0
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570707 4935 generic.go:334] "Generic (PLEG): container finished" podID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerID="60f2443467ea79cf6cd31ace0b88a01c836398f31a831f443ebf60ee013be905" exitCode=0
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570734 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerDied","Data":"ee366bc255d7d29630086b2d3e47320f0a5c5fe5b8f7f2e57a884327253bec87"}
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570769 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerDied","Data":"bb3a316c546d57dfa590b5287064fe9b5334c438aab9482bd0aa04c4fe7ad38c"}
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570786 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerDied","Data":"1a4df0611eeb910f8bd70c20df59b78508108ad5f7d8bdc34c861d01611c78d9"}
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.570798 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerDied","Data":"60f2443467ea79cf6cd31ace0b88a01c836398f31a831f443ebf60ee013be905"}
Dec 17 09:25:48 crc kubenswrapper[4935]: I1217 09:25:48.927588 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.753209 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.876025 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-config-data\") pod \"fc3bdffb-fb62-44b3-85a2-50126283d365\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") "
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.876452 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-run-httpd\") pod \"fc3bdffb-fb62-44b3-85a2-50126283d365\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") "
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.876559 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-combined-ca-bundle\") pod \"fc3bdffb-fb62-44b3-85a2-50126283d365\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") "
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.876651 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-scripts\") pod \"fc3bdffb-fb62-44b3-85a2-50126283d365\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") "
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.876828 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-sg-core-conf-yaml\") pod \"fc3bdffb-fb62-44b3-85a2-50126283d365\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") "
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.876859 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-log-httpd\") pod \"fc3bdffb-fb62-44b3-85a2-50126283d365\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") "
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.876887 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtzgg\" (UniqueName: \"kubernetes.io/projected/fc3bdffb-fb62-44b3-85a2-50126283d365-kube-api-access-vtzgg\") pod \"fc3bdffb-fb62-44b3-85a2-50126283d365\" (UID: \"fc3bdffb-fb62-44b3-85a2-50126283d365\") "
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.881965 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fc3bdffb-fb62-44b3-85a2-50126283d365" (UID: "fc3bdffb-fb62-44b3-85a2-50126283d365"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.882777 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fc3bdffb-fb62-44b3-85a2-50126283d365" (UID: "fc3bdffb-fb62-44b3-85a2-50126283d365"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.885544 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3bdffb-fb62-44b3-85a2-50126283d365-kube-api-access-vtzgg" (OuterVolumeSpecName: "kube-api-access-vtzgg") pod "fc3bdffb-fb62-44b3-85a2-50126283d365" (UID: "fc3bdffb-fb62-44b3-85a2-50126283d365"). InnerVolumeSpecName "kube-api-access-vtzgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.886489 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-scripts" (OuterVolumeSpecName: "scripts") pod "fc3bdffb-fb62-44b3-85a2-50126283d365" (UID: "fc3bdffb-fb62-44b3-85a2-50126283d365"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.920116 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fc3bdffb-fb62-44b3-85a2-50126283d365" (UID: "fc3bdffb-fb62-44b3-85a2-50126283d365"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.978553 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc3bdffb-fb62-44b3-85a2-50126283d365" (UID: "fc3bdffb-fb62-44b3-85a2-50126283d365"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.980702 4935 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.980763 4935 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.980783 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtzgg\" (UniqueName: \"kubernetes.io/projected/fc3bdffb-fb62-44b3-85a2-50126283d365-kube-api-access-vtzgg\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.980802 4935 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3bdffb-fb62-44b3-85a2-50126283d365-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.980814 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:53 crc kubenswrapper[4935]: I1217 09:25:53.980827 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-scripts\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.008428 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-config-data" (OuterVolumeSpecName: "config-data") pod "fc3bdffb-fb62-44b3-85a2-50126283d365" (UID: "fc3bdffb-fb62-44b3-85a2-50126283d365"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.083228 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3bdffb-fb62-44b3-85a2-50126283d365-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.639786 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3bdffb-fb62-44b3-85a2-50126283d365","Type":"ContainerDied","Data":"c4d39c2191b296902e37744dd50e4914b275e5461ba12e2515baaa20cd4d4f78"}
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.640231 4935 scope.go:117] "RemoveContainer" containerID="ee366bc255d7d29630086b2d3e47320f0a5c5fe5b8f7f2e57a884327253bec87"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.639856 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.642133 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hk94m" event={"ID":"e9add23c-5f09-479c-94f8-4e4c35af6dde","Type":"ContainerStarted","Data":"6da47cc7beed25361c935a9c4b27ad95a0c65cf8a0ff3a41da9dc1d4f70418a7"}
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.674012 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-hk94m" podStartSLOduration=2.672267354 podStartE2EDuration="11.67399502s" podCreationTimestamp="2025-12-17 09:25:43 +0000 UTC" firstStartedPulling="2025-12-17 09:25:44.51489263 +0000 UTC m=+1264.174733393" lastFinishedPulling="2025-12-17 09:25:53.516620296 +0000 UTC m=+1273.176461059" observedRunningTime="2025-12-17 09:25:54.672182036 +0000 UTC m=+1274.332022799" watchObservedRunningTime="2025-12-17 09:25:54.67399502 +0000 UTC m=+1274.333835783"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.690870 4935 scope.go:117] "RemoveContainer" containerID="bb3a316c546d57dfa590b5287064fe9b5334c438aab9482bd0aa04c4fe7ad38c"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.698128 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.706376 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.713244 4935 scope.go:117] "RemoveContainer" containerID="1a4df0611eeb910f8bd70c20df59b78508108ad5f7d8bdc34c861d01611c78d9"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.730719 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 17 09:25:54 crc kubenswrapper[4935]: E1217 09:25:54.731343 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-central-agent"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731381 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-central-agent"
Dec 17 09:25:54 crc kubenswrapper[4935]: E1217 09:25:54.731403 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="sg-core"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731410 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="sg-core"
Dec 17 09:25:54 crc kubenswrapper[4935]: E1217 09:25:54.731427 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon-log"
Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731437 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon-log"
Dec 17 09:25:54 crc kubenswrapper[4935]: E1217 09:25:54.731471
4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-notification-agent" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731479 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-notification-agent" Dec 17 09:25:54 crc kubenswrapper[4935]: E1217 09:25:54.731499 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="proxy-httpd" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731506 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="proxy-httpd" Dec 17 09:25:54 crc kubenswrapper[4935]: E1217 09:25:54.731527 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731534 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731820 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-central-agent" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731840 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="sg-core" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731877 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="ceilometer-notification-agent" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731893 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon-log" Dec 17 09:25:54 crc kubenswrapper[4935]: 
I1217 09:25:54.731917 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5385d045-3f7c-447d-8ce8-d12a8de0cdce" containerName="horizon" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.731961 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" containerName="proxy-httpd" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.734703 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.739791 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.747353 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.747694 4935 scope.go:117] "RemoveContainer" containerID="60f2443467ea79cf6cd31ace0b88a01c836398f31a831f443ebf60ee013be905" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.757713 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.795328 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-config-data\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.795456 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.795499 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-log-httpd\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.795523 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-run-httpd\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.795585 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-scripts\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.795624 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.795876 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2vcb\" (UniqueName: \"kubernetes.io/projected/3ca17879-b1f6-4a4b-84e4-94e44e79064e-kube-api-access-p2vcb\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.897994 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-log-httpd\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.898066 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-run-httpd\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.898122 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-scripts\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.898203 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.898861 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-log-httpd\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.899026 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-run-httpd\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.899549 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-p2vcb\" (UniqueName: \"kubernetes.io/projected/3ca17879-b1f6-4a4b-84e4-94e44e79064e-kube-api-access-p2vcb\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.899713 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-config-data\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.899803 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.906227 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-scripts\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.917103 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.918102 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-config-data\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " 
pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.918488 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:54 crc kubenswrapper[4935]: I1217 09:25:54.929456 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2vcb\" (UniqueName: \"kubernetes.io/projected/3ca17879-b1f6-4a4b-84e4-94e44e79064e-kube-api-access-p2vcb\") pod \"ceilometer-0\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " pod="openstack/ceilometer-0" Dec 17 09:25:55 crc kubenswrapper[4935]: I1217 09:25:55.054482 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:25:55 crc kubenswrapper[4935]: I1217 09:25:55.137281 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3bdffb-fb62-44b3-85a2-50126283d365" path="/var/lib/kubelet/pods/fc3bdffb-fb62-44b3-85a2-50126283d365/volumes" Dec 17 09:25:55 crc kubenswrapper[4935]: I1217 09:25:55.523337 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:25:55 crc kubenswrapper[4935]: W1217 09:25:55.528076 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ca17879_b1f6_4a4b_84e4_94e44e79064e.slice/crio-46163139fa5bd609af098d83f88d661e59a8066f1e3109d4dfa754a1ef6b481e WatchSource:0}: Error finding container 46163139fa5bd609af098d83f88d661e59a8066f1e3109d4dfa754a1ef6b481e: Status 404 returned error can't find the container with id 46163139fa5bd609af098d83f88d661e59a8066f1e3109d4dfa754a1ef6b481e Dec 17 09:25:55 crc kubenswrapper[4935]: I1217 09:25:55.652995 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerStarted","Data":"46163139fa5bd609af098d83f88d661e59a8066f1e3109d4dfa754a1ef6b481e"} Dec 17 09:25:57 crc kubenswrapper[4935]: I1217 09:25:57.674229 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerStarted","Data":"719ac1c2785181fc1cd86816cf2dfee2ce430d929efae8f5926f6799c86b3d2f"} Dec 17 09:25:57 crc kubenswrapper[4935]: I1217 09:25:57.675184 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerStarted","Data":"4041f8cace17012f7f4c66cb3e0ee5ea722d274bd653c1f0a2bb28dd153aeb28"} Dec 17 09:25:58 crc kubenswrapper[4935]: I1217 09:25:58.687364 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerStarted","Data":"93154e65932f67853ef03c878ca2049e0c30a893ac93207360598fc529f11812"} Dec 17 09:26:01 crc kubenswrapper[4935]: I1217 09:26:01.715883 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerStarted","Data":"9846a24b830a6d783c034c0c4e309c0ad699262f7382a3cebd488956869c13f8"} Dec 17 09:26:01 crc kubenswrapper[4935]: I1217 09:26:01.718086 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 17 09:26:01 crc kubenswrapper[4935]: I1217 09:26:01.747114 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.92241844 podStartE2EDuration="7.747096956s" podCreationTimestamp="2025-12-17 09:25:54 +0000 UTC" firstStartedPulling="2025-12-17 09:25:55.531683296 +0000 UTC m=+1275.191524059" lastFinishedPulling="2025-12-17 09:26:00.356361812 +0000 UTC m=+1280.016202575" observedRunningTime="2025-12-17 
09:26:01.746787239 +0000 UTC m=+1281.406628002" watchObservedRunningTime="2025-12-17 09:26:01.747096956 +0000 UTC m=+1281.406937719" Dec 17 09:26:02 crc kubenswrapper[4935]: I1217 09:26:02.550396 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:03 crc kubenswrapper[4935]: I1217 09:26:03.737949 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-central-agent" containerID="cri-o://4041f8cace17012f7f4c66cb3e0ee5ea722d274bd653c1f0a2bb28dd153aeb28" gracePeriod=30 Dec 17 09:26:03 crc kubenswrapper[4935]: I1217 09:26:03.738092 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="proxy-httpd" containerID="cri-o://9846a24b830a6d783c034c0c4e309c0ad699262f7382a3cebd488956869c13f8" gracePeriod=30 Dec 17 09:26:03 crc kubenswrapper[4935]: I1217 09:26:03.738087 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-notification-agent" containerID="cri-o://719ac1c2785181fc1cd86816cf2dfee2ce430d929efae8f5926f6799c86b3d2f" gracePeriod=30 Dec 17 09:26:03 crc kubenswrapper[4935]: I1217 09:26:03.738635 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="sg-core" containerID="cri-o://93154e65932f67853ef03c878ca2049e0c30a893ac93207360598fc529f11812" gracePeriod=30 Dec 17 09:26:04 crc kubenswrapper[4935]: I1217 09:26:04.747854 4935 generic.go:334] "Generic (PLEG): container finished" podID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerID="9846a24b830a6d783c034c0c4e309c0ad699262f7382a3cebd488956869c13f8" exitCode=0 Dec 17 09:26:04 crc kubenswrapper[4935]: I1217 09:26:04.748505 
4935 generic.go:334] "Generic (PLEG): container finished" podID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerID="93154e65932f67853ef03c878ca2049e0c30a893ac93207360598fc529f11812" exitCode=2 Dec 17 09:26:04 crc kubenswrapper[4935]: I1217 09:26:04.748515 4935 generic.go:334] "Generic (PLEG): container finished" podID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerID="719ac1c2785181fc1cd86816cf2dfee2ce430d929efae8f5926f6799c86b3d2f" exitCode=0 Dec 17 09:26:04 crc kubenswrapper[4935]: I1217 09:26:04.748543 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerDied","Data":"9846a24b830a6d783c034c0c4e309c0ad699262f7382a3cebd488956869c13f8"} Dec 17 09:26:04 crc kubenswrapper[4935]: I1217 09:26:04.748576 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerDied","Data":"93154e65932f67853ef03c878ca2049e0c30a893ac93207360598fc529f11812"} Dec 17 09:26:04 crc kubenswrapper[4935]: I1217 09:26:04.748588 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerDied","Data":"719ac1c2785181fc1cd86816cf2dfee2ce430d929efae8f5926f6799c86b3d2f"} Dec 17 09:26:07 crc kubenswrapper[4935]: I1217 09:26:07.782853 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:26:07 crc kubenswrapper[4935]: I1217 09:26:07.783743 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-log" containerID="cri-o://b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674" gracePeriod=30 Dec 17 09:26:07 crc kubenswrapper[4935]: I1217 09:26:07.783902 4935 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-httpd" containerID="cri-o://f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca" gracePeriod=30 Dec 17 09:26:07 crc kubenswrapper[4935]: I1217 09:26:07.787091 4935 generic.go:334] "Generic (PLEG): container finished" podID="e9add23c-5f09-479c-94f8-4e4c35af6dde" containerID="6da47cc7beed25361c935a9c4b27ad95a0c65cf8a0ff3a41da9dc1d4f70418a7" exitCode=0 Dec 17 09:26:07 crc kubenswrapper[4935]: I1217 09:26:07.787173 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hk94m" event={"ID":"e9add23c-5f09-479c-94f8-4e4c35af6dde","Type":"ContainerDied","Data":"6da47cc7beed25361c935a9c4b27ad95a0c65cf8a0ff3a41da9dc1d4f70418a7"} Dec 17 09:26:08 crc kubenswrapper[4935]: I1217 09:26:08.824156 4935 generic.go:334] "Generic (PLEG): container finished" podID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerID="4041f8cace17012f7f4c66cb3e0ee5ea722d274bd653c1f0a2bb28dd153aeb28" exitCode=0 Dec 17 09:26:08 crc kubenswrapper[4935]: I1217 09:26:08.824587 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerDied","Data":"4041f8cace17012f7f4c66cb3e0ee5ea722d274bd653c1f0a2bb28dd153aeb28"} Dec 17 09:26:08 crc kubenswrapper[4935]: I1217 09:26:08.836202 4935 generic.go:334] "Generic (PLEG): container finished" podID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerID="b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674" exitCode=143 Dec 17 09:26:08 crc kubenswrapper[4935]: I1217 09:26:08.836367 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"abc82931-62eb-42ba-952d-9b1c99f2fd25","Type":"ContainerDied","Data":"b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674"} Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.038763 4935 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.093989 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.094350 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-log" containerID="cri-o://32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d" gracePeriod=30 Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.094923 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-httpd" containerID="cri-o://9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280" gracePeriod=30 Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.129202 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-run-httpd\") pod \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.129240 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-log-httpd\") pod \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.129399 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-scripts\") pod \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\" (UID: 
\"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.129495 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-combined-ca-bundle\") pod \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.129629 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-config-data\") pod \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.129653 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-sg-core-conf-yaml\") pod \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.129756 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2vcb\" (UniqueName: \"kubernetes.io/projected/3ca17879-b1f6-4a4b-84e4-94e44e79064e-kube-api-access-p2vcb\") pod \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\" (UID: \"3ca17879-b1f6-4a4b-84e4-94e44e79064e\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.130485 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3ca17879-b1f6-4a4b-84e4-94e44e79064e" (UID: "3ca17879-b1f6-4a4b-84e4-94e44e79064e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.130938 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3ca17879-b1f6-4a4b-84e4-94e44e79064e" (UID: "3ca17879-b1f6-4a4b-84e4-94e44e79064e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.131249 4935 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.131314 4935 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3ca17879-b1f6-4a4b-84e4-94e44e79064e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.160529 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-scripts" (OuterVolumeSpecName: "scripts") pod "3ca17879-b1f6-4a4b-84e4-94e44e79064e" (UID: "3ca17879-b1f6-4a4b-84e4-94e44e79064e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.160692 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ca17879-b1f6-4a4b-84e4-94e44e79064e-kube-api-access-p2vcb" (OuterVolumeSpecName: "kube-api-access-p2vcb") pod "3ca17879-b1f6-4a4b-84e4-94e44e79064e" (UID: "3ca17879-b1f6-4a4b-84e4-94e44e79064e"). InnerVolumeSpecName "kube-api-access-p2vcb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.191829 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3ca17879-b1f6-4a4b-84e4-94e44e79064e" (UID: "3ca17879-b1f6-4a4b-84e4-94e44e79064e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.234850 4935 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.234892 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2vcb\" (UniqueName: \"kubernetes.io/projected/3ca17879-b1f6-4a4b-84e4-94e44e79064e-kube-api-access-p2vcb\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.234909 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.284398 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ca17879-b1f6-4a4b-84e4-94e44e79064e" (UID: "3ca17879-b1f6-4a4b-84e4-94e44e79064e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.310912 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-config-data" (OuterVolumeSpecName: "config-data") pod "3ca17879-b1f6-4a4b-84e4-94e44e79064e" (UID: "3ca17879-b1f6-4a4b-84e4-94e44e79064e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.313125 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hk94m" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.337898 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.337967 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ca17879-b1f6-4a4b-84e4-94e44e79064e-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.438944 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-scripts\") pod \"e9add23c-5f09-479c-94f8-4e4c35af6dde\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.439075 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-combined-ca-bundle\") pod \"e9add23c-5f09-479c-94f8-4e4c35af6dde\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.439310 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-config-data\") pod \"e9add23c-5f09-479c-94f8-4e4c35af6dde\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.439371 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw99p\" (UniqueName: \"kubernetes.io/projected/e9add23c-5f09-479c-94f8-4e4c35af6dde-kube-api-access-tw99p\") pod \"e9add23c-5f09-479c-94f8-4e4c35af6dde\" (UID: \"e9add23c-5f09-479c-94f8-4e4c35af6dde\") " Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.443169 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9add23c-5f09-479c-94f8-4e4c35af6dde-kube-api-access-tw99p" (OuterVolumeSpecName: "kube-api-access-tw99p") pod "e9add23c-5f09-479c-94f8-4e4c35af6dde" (UID: "e9add23c-5f09-479c-94f8-4e4c35af6dde"). InnerVolumeSpecName "kube-api-access-tw99p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.445267 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-scripts" (OuterVolumeSpecName: "scripts") pod "e9add23c-5f09-479c-94f8-4e4c35af6dde" (UID: "e9add23c-5f09-479c-94f8-4e4c35af6dde"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.475830 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-config-data" (OuterVolumeSpecName: "config-data") pod "e9add23c-5f09-479c-94f8-4e4c35af6dde" (UID: "e9add23c-5f09-479c-94f8-4e4c35af6dde"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.478036 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9add23c-5f09-479c-94f8-4e4c35af6dde" (UID: "e9add23c-5f09-479c-94f8-4e4c35af6dde"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.543164 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.543215 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.543229 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9add23c-5f09-479c-94f8-4e4c35af6dde-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.543240 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw99p\" (UniqueName: \"kubernetes.io/projected/e9add23c-5f09-479c-94f8-4e4c35af6dde-kube-api-access-tw99p\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.847975 4935 generic.go:334] "Generic (PLEG): container finished" podID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerID="32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d" exitCode=143 Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.848049 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"99a5eb5b-829d-49da-a3d1-770d615b9c5a","Type":"ContainerDied","Data":"32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d"} Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.851023 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3ca17879-b1f6-4a4b-84e4-94e44e79064e","Type":"ContainerDied","Data":"46163139fa5bd609af098d83f88d661e59a8066f1e3109d4dfa754a1ef6b481e"} Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.851051 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.851085 4935 scope.go:117] "RemoveContainer" containerID="9846a24b830a6d783c034c0c4e309c0ad699262f7382a3cebd488956869c13f8" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.855129 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-hk94m" event={"ID":"e9add23c-5f09-479c-94f8-4e4c35af6dde","Type":"ContainerDied","Data":"a669e9e7b49a1f654769027cbdd7934275ebc0017e57b221d34e6eb8eab5b4dd"} Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.855298 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a669e9e7b49a1f654769027cbdd7934275ebc0017e57b221d34e6eb8eab5b4dd" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.855238 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-hk94m" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.881210 4935 scope.go:117] "RemoveContainer" containerID="93154e65932f67853ef03c878ca2049e0c30a893ac93207360598fc529f11812" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.910946 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.924322 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.935601 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 17 09:26:09 crc kubenswrapper[4935]: E1217 09:26:09.938524 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-notification-agent" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938551 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-notification-agent" Dec 17 09:26:09 crc kubenswrapper[4935]: E1217 09:26:09.938565 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-central-agent" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938573 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-central-agent" Dec 17 09:26:09 crc kubenswrapper[4935]: E1217 09:26:09.938599 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="proxy-httpd" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938606 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="proxy-httpd" Dec 17 09:26:09 crc kubenswrapper[4935]: E1217 09:26:09.938620 
4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9add23c-5f09-479c-94f8-4e4c35af6dde" containerName="nova-cell0-conductor-db-sync" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938629 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9add23c-5f09-479c-94f8-4e4c35af6dde" containerName="nova-cell0-conductor-db-sync" Dec 17 09:26:09 crc kubenswrapper[4935]: E1217 09:26:09.938640 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="sg-core" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938647 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="sg-core" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938857 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-central-agent" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938871 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="sg-core" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938885 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9add23c-5f09-479c-94f8-4e4c35af6dde" containerName="nova-cell0-conductor-db-sync" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938896 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="ceilometer-notification-agent" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.938908 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" containerName="proxy-httpd" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.939639 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.942901 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.943079 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-q2l67" Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.947429 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 17 09:26:09 crc kubenswrapper[4935]: I1217 09:26:09.997648 4935 scope.go:117] "RemoveContainer" containerID="719ac1c2785181fc1cd86816cf2dfee2ce430d929efae8f5926f6799c86b3d2f" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.045609 4935 scope.go:117] "RemoveContainer" containerID="4041f8cace17012f7f4c66cb3e0ee5ea722d274bd653c1f0a2bb28dd153aeb28" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.050945 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.055498 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.058501 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc3ff0-93d5-4ec4-b843-496b06524eb0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.058547 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc3ff0-93d5-4ec4-b843-496b06524eb0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.058711 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdq8z\" (UniqueName: \"kubernetes.io/projected/2efc3ff0-93d5-4ec4-b843-496b06524eb0-kube-api-access-kdq8z\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.061576 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.064866 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.070964 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161031 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-scripts\") pod 
\"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161084 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161128 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdq8z\" (UniqueName: \"kubernetes.io/projected/2efc3ff0-93d5-4ec4-b843-496b06524eb0-kube-api-access-kdq8z\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161168 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-run-httpd\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161196 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc3ff0-93d5-4ec4-b843-496b06524eb0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161214 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-log-httpd\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc 
kubenswrapper[4935]: I1217 09:26:10.161232 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc3ff0-93d5-4ec4-b843-496b06524eb0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161258 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161335 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-config-data\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.161363 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lxr5\" (UniqueName: \"kubernetes.io/projected/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-kube-api-access-7lxr5\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.167875 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc3ff0-93d5-4ec4-b843-496b06524eb0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.167883 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc3ff0-93d5-4ec4-b843-496b06524eb0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.183953 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdq8z\" (UniqueName: \"kubernetes.io/projected/2efc3ff0-93d5-4ec4-b843-496b06524eb0-kube-api-access-kdq8z\") pod \"nova-cell0-conductor-0\" (UID: \"2efc3ff0-93d5-4ec4-b843-496b06524eb0\") " pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.262857 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-scripts\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.262918 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.262980 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-run-httpd\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.263012 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-log-httpd\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " 
pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.263042 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.263081 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-config-data\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.263107 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lxr5\" (UniqueName: \"kubernetes.io/projected/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-kube-api-access-7lxr5\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.263923 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-log-httpd\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.263979 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-run-httpd\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.268644 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.269083 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-scripts\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.269819 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.269993 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-config-data\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.293162 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lxr5\" (UniqueName: \"kubernetes.io/projected/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-kube-api-access-7lxr5\") pod \"ceilometer-0\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.307500 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.377679 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.861776 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.955991 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:47356->10.217.0.148:9292: read: connection reset by peer" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.956071 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:47344->10.217.0.148:9292: read: connection reset by peer" Dec 17 09:26:10 crc kubenswrapper[4935]: I1217 09:26:10.987442 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.138232 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ca17879-b1f6-4a4b-84e4-94e44e79064e" path="/var/lib/kubelet/pods/3ca17879-b1f6-4a4b-84e4-94e44e79064e/volumes" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.530592 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589642 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589752 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-logs\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589786 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-scripts\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589830 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-combined-ca-bundle\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589891 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-httpd-run\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589939 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xwjs\" (UniqueName: 
\"kubernetes.io/projected/abc82931-62eb-42ba-952d-9b1c99f2fd25-kube-api-access-9xwjs\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589958 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-public-tls-certs\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.589981 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-config-data\") pod \"abc82931-62eb-42ba-952d-9b1c99f2fd25\" (UID: \"abc82931-62eb-42ba-952d-9b1c99f2fd25\") " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.590400 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-logs" (OuterVolumeSpecName: "logs") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.590711 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.597574 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.598619 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc82931-62eb-42ba-952d-9b1c99f2fd25-kube-api-access-9xwjs" (OuterVolumeSpecName: "kube-api-access-9xwjs") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "kube-api-access-9xwjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.600611 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-scripts" (OuterVolumeSpecName: "scripts") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.635793 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.653215 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.660042 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-config-data" (OuterVolumeSpecName: "config-data") pod "abc82931-62eb-42ba-952d-9b1c99f2fd25" (UID: "abc82931-62eb-42ba-952d-9b1c99f2fd25"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.693057 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.693101 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.693113 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.693124 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 
09:26:11.693140 4935 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/abc82931-62eb-42ba-952d-9b1c99f2fd25-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.693147 4935 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.693156 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xwjs\" (UniqueName: \"kubernetes.io/projected/abc82931-62eb-42ba-952d-9b1c99f2fd25-kube-api-access-9xwjs\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.693166 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc82931-62eb-42ba-952d-9b1c99f2fd25-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.713361 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.795116 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.878212 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2efc3ff0-93d5-4ec4-b843-496b06524eb0","Type":"ContainerStarted","Data":"ecdb8ad85f4a49ef838ef96a39043dfb12d70f78c98e124cbc0c467dc7f52dbe"} Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.878288 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" 
event={"ID":"2efc3ff0-93d5-4ec4-b843-496b06524eb0","Type":"ContainerStarted","Data":"14a20aaa414147c4fb463ecdb2570d8a9257da170bab9f14c0e83640b65e2ce3"} Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.878406 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.883101 4935 generic.go:334] "Generic (PLEG): container finished" podID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerID="f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca" exitCode=0 Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.883159 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.883232 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"abc82931-62eb-42ba-952d-9b1c99f2fd25","Type":"ContainerDied","Data":"f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca"} Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.883339 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"abc82931-62eb-42ba-952d-9b1c99f2fd25","Type":"ContainerDied","Data":"989501f96fa51bfbca20816f711b5f4ad320a7342a9db935d6aa808294b5d37f"} Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.883383 4935 scope.go:117] "RemoveContainer" containerID="f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.885060 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerStarted","Data":"5ff50076dd125443ae23cb6bdd7abffa1e8ac490448cc0ccc33b53c408437af9"} Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.900450 4935 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.900419516 podStartE2EDuration="2.900419516s" podCreationTimestamp="2025-12-17 09:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:11.893807534 +0000 UTC m=+1291.553648307" watchObservedRunningTime="2025-12-17 09:26:11.900419516 +0000 UTC m=+1291.560260289" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.918742 4935 scope.go:117] "RemoveContainer" containerID="b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674" Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.931709 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:26:11 crc kubenswrapper[4935]: I1217 09:26:11.947957 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.033320 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:26:12 crc kubenswrapper[4935]: E1217 09:26:12.034532 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-httpd" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.034558 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-httpd" Dec 17 09:26:12 crc kubenswrapper[4935]: E1217 09:26:12.034591 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-log" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.034598 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-log" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.035103 4935 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-httpd" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.035144 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" containerName="glance-log" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.037131 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.054027 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.054253 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.054936 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.104560 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.104659 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.104944 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2ba46087-25fc-485e-b41e-7e55dbd860c6-logs\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.105039 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.105518 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.105910 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.105956 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btmfh\" (UniqueName: \"kubernetes.io/projected/2ba46087-25fc-485e-b41e-7e55dbd860c6-kube-api-access-btmfh\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.106140 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ba46087-25fc-485e-b41e-7e55dbd860c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.119563 4935 scope.go:117] "RemoveContainer" containerID="f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca" Dec 17 09:26:12 crc kubenswrapper[4935]: E1217 09:26:12.125103 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca\": container with ID starting with f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca not found: ID does not exist" containerID="f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.125172 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca"} err="failed to get container status \"f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca\": rpc error: code = NotFound desc = could not find container \"f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca\": container with ID starting with f7cb78ccd1f739920b35d16dd023106d6bf3308d0d4c8e2aaff46da670d830ca not found: ID does not exist" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.125364 4935 scope.go:117] "RemoveContainer" containerID="b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674" Dec 17 09:26:12 crc kubenswrapper[4935]: E1217 09:26:12.126777 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674\": container with ID starting with 
b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674 not found: ID does not exist" containerID="b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.126814 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674"} err="failed to get container status \"b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674\": rpc error: code = NotFound desc = could not find container \"b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674\": container with ID starting with b957be5e1d6afb723678c9986a1067267818d7f74857e494a97119fe0b8c5674 not found: ID does not exist" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.209579 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ba46087-25fc-485e-b41e-7e55dbd860c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.208696 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2ba46087-25fc-485e-b41e-7e55dbd860c6-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.211221 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.212512 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.212868 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba46087-25fc-485e-b41e-7e55dbd860c6-logs\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.213012 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.213191 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.213457 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ba46087-25fc-485e-b41e-7e55dbd860c6-logs\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.212908 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.215588 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.215775 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btmfh\" (UniqueName: \"kubernetes.io/projected/2ba46087-25fc-485e-b41e-7e55dbd860c6-kube-api-access-btmfh\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.219252 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.225234 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.233992 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-config-data\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.234449 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ba46087-25fc-485e-b41e-7e55dbd860c6-scripts\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.245344 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btmfh\" (UniqueName: \"kubernetes.io/projected/2ba46087-25fc-485e-b41e-7e55dbd860c6-kube-api-access-btmfh\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.258588 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"2ba46087-25fc-485e-b41e-7e55dbd860c6\") " pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.444064 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.828731 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.906577 4935 generic.go:334] "Generic (PLEG): container finished" podID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerID="9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280" exitCode=0 Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.906769 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.907618 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99a5eb5b-829d-49da-a3d1-770d615b9c5a","Type":"ContainerDied","Data":"9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280"} Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.907681 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"99a5eb5b-829d-49da-a3d1-770d615b9c5a","Type":"ContainerDied","Data":"f8b4883fe673e97960fc30362e9a5b303f662f818179df8bf170a5e8b24422e5"} Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.907706 4935 scope.go:117] "RemoveContainer" containerID="9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.923242 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerStarted","Data":"355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0"} Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.935116 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 
09:26:12.935202 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-scripts\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.935299 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-logs\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.935390 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkj7z\" (UniqueName: \"kubernetes.io/projected/99a5eb5b-829d-49da-a3d1-770d615b9c5a-kube-api-access-xkj7z\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.935433 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-config-data\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.935472 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-httpd-run\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.935554 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-internal-tls-certs\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: 
\"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.935646 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-combined-ca-bundle\") pod \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\" (UID: \"99a5eb5b-829d-49da-a3d1-770d615b9c5a\") " Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.947320 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-logs" (OuterVolumeSpecName: "logs") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.947881 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.957182 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a5eb5b-829d-49da-a3d1-770d615b9c5a-kube-api-access-xkj7z" (OuterVolumeSpecName: "kube-api-access-xkj7z") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). InnerVolumeSpecName "kube-api-access-xkj7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.958975 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:26:12 crc kubenswrapper[4935]: I1217 09:26:12.960445 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-scripts" (OuterVolumeSpecName: "scripts") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.007736 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.040717 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkj7z\" (UniqueName: \"kubernetes.io/projected/99a5eb5b-829d-49da-a3d1-770d615b9c5a-kube-api-access-xkj7z\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.040756 4935 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.040770 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.040802 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.040819 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.040833 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99a5eb5b-829d-49da-a3d1-770d615b9c5a-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.055364 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.069565 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-config-data" (OuterVolumeSpecName: "config-data") pod "99a5eb5b-829d-49da-a3d1-770d615b9c5a" (UID: "99a5eb5b-829d-49da-a3d1-770d615b9c5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.082107 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.123930 4935 scope.go:117] "RemoveContainer" containerID="32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.145647 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.145686 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.145701 4935 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99a5eb5b-829d-49da-a3d1-770d615b9c5a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.145949 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc82931-62eb-42ba-952d-9b1c99f2fd25" path="/var/lib/kubelet/pods/abc82931-62eb-42ba-952d-9b1c99f2fd25/volumes" Dec 17 09:26:13 crc kubenswrapper[4935]: 
I1217 09:26:13.161489 4935 scope.go:117] "RemoveContainer" containerID="9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280"
Dec 17 09:26:13 crc kubenswrapper[4935]: E1217 09:26:13.163634 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280\": container with ID starting with 9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280 not found: ID does not exist" containerID="9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.163699 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280"} err="failed to get container status \"9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280\": rpc error: code = NotFound desc = could not find container \"9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280\": container with ID starting with 9871589aaa103a1a187a82865c72a6837e3c4465553fbeedf383ede34a5b9280 not found: ID does not exist"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.163734 4935 scope.go:117] "RemoveContainer" containerID="32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d"
Dec 17 09:26:13 crc kubenswrapper[4935]: E1217 09:26:13.164134 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d\": container with ID starting with 32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d not found: ID does not exist" containerID="32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.164183 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d"} err="failed to get container status \"32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d\": rpc error: code = NotFound desc = could not find container \"32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d\": container with ID starting with 32ef75e22bac89f5ff4a355111f1119a0ad3be05cbe16f1507145f1d04cbeb0d not found: ID does not exist"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.197991 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.283414 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.303813 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.323488 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 17 09:26:13 crc kubenswrapper[4935]: E1217 09:26:13.325537 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-log"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.325574 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-log"
Dec 17 09:26:13 crc kubenswrapper[4935]: E1217 09:26:13.325597 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-httpd"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.325604 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-httpd"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.325870 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-log"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.325913 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" containerName="glance-httpd"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.336977 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.336977 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.342086 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.342251 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.451796 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.451863 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a918434d-797e-4c05-b048-8a5c5cbc18c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.451944 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a918434d-797e-4c05-b048-8a5c5cbc18c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.451971 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swvrr\" (UniqueName: \"kubernetes.io/projected/a918434d-797e-4c05-b048-8a5c5cbc18c0-kube-api-access-swvrr\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.452903 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.453152 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.453192 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.453292 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.556585 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.556678 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.556766 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a918434d-797e-4c05-b048-8a5c5cbc18c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.556784 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.556873 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a918434d-797e-4c05-b048-8a5c5cbc18c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.556897 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swvrr\" (UniqueName: \"kubernetes.io/projected/a918434d-797e-4c05-b048-8a5c5cbc18c0-kube-api-access-swvrr\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.556932 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.557028 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.560062 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.560782 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a918434d-797e-4c05-b048-8a5c5cbc18c0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.561538 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a918434d-797e-4c05-b048-8a5c5cbc18c0-logs\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.565123 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.567139 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.568967 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.572452 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a918434d-797e-4c05-b048-8a5c5cbc18c0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.583206 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swvrr\" (UniqueName: \"kubernetes.io/projected/a918434d-797e-4c05-b048-8a5c5cbc18c0-kube-api-access-swvrr\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.605210 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"a918434d-797e-4c05-b048-8a5c5cbc18c0\") " pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.682040 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.945697 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerStarted","Data":"f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47"}
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.946371 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerStarted","Data":"a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee"}
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.949931 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ba46087-25fc-485e-b41e-7e55dbd860c6","Type":"ContainerStarted","Data":"50ad0dd5b3b6a5a240e37c0af61c85b81fea2a671cba12b2ffafad50888bfd7a"}
Dec 17 09:26:13 crc kubenswrapper[4935]: I1217 09:26:13.949969 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ba46087-25fc-485e-b41e-7e55dbd860c6","Type":"ContainerStarted","Data":"9f6ff5485f4a92caaefae4d72b64a4567c8196d655bcc7a687857eb8054955cf"}
Dec 17 09:26:14 crc kubenswrapper[4935]: I1217 09:26:14.293738 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Dec 17 09:26:14 crc kubenswrapper[4935]: I1217 09:26:14.976143 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2ba46087-25fc-485e-b41e-7e55dbd860c6","Type":"ContainerStarted","Data":"ae44defd751c56413fb7592a0f74779b3db575e09f347e449efb9112d15a6872"}
Dec 17 09:26:14 crc kubenswrapper[4935]: I1217 09:26:14.982755 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a918434d-797e-4c05-b048-8a5c5cbc18c0","Type":"ContainerStarted","Data":"c8f461c3321f2c079b22bc7d7be7c3ccd11636135670168b345b866b40d848f4"}
Dec 17 09:26:14 crc kubenswrapper[4935]: I1217 09:26:14.982804 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a918434d-797e-4c05-b048-8a5c5cbc18c0","Type":"ContainerStarted","Data":"5a537efad24d8235543224a0b589162ce76193c5faaec6a7639fa868dea86455"}
Dec 17 09:26:15 crc kubenswrapper[4935]: I1217 09:26:15.006941 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.006917581 podStartE2EDuration="4.006917581s" podCreationTimestamp="2025-12-17 09:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:14.997192054 +0000 UTC m=+1294.657032817" watchObservedRunningTime="2025-12-17 09:26:15.006917581 +0000 UTC m=+1294.666758344"
Dec 17 09:26:15 crc kubenswrapper[4935]: I1217 09:26:15.140322 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a5eb5b-829d-49da-a3d1-770d615b9c5a" path="/var/lib/kubelet/pods/99a5eb5b-829d-49da-a3d1-770d615b9c5a/volumes"
Dec 17 09:26:15 crc kubenswrapper[4935]: I1217 09:26:15.995987 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerStarted","Data":"147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7"}
Dec 17 09:26:15 crc kubenswrapper[4935]: I1217 09:26:15.996911 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 17 09:26:16 crc kubenswrapper[4935]: I1217 09:26:16.000455 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a918434d-797e-4c05-b048-8a5c5cbc18c0","Type":"ContainerStarted","Data":"2530a1bbce9ab992f591ae522d3174b281ef1dcb4865af0117da4908fee9c417"}
Dec 17 09:26:16 crc kubenswrapper[4935]: I1217 09:26:16.044525 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.677688017 podStartE2EDuration="7.044506221s" podCreationTimestamp="2025-12-17 09:26:09 +0000 UTC" firstStartedPulling="2025-12-17 09:26:10.992544604 +0000 UTC m=+1290.652385367" lastFinishedPulling="2025-12-17 09:26:15.359362808 +0000 UTC m=+1295.019203571" observedRunningTime="2025-12-17 09:26:16.039536429 +0000 UTC m=+1295.699377192" watchObservedRunningTime="2025-12-17 09:26:16.044506221 +0000 UTC m=+1295.704346984"
Dec 17 09:26:16 crc kubenswrapper[4935]: I1217 09:26:16.089705 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.089682554 podStartE2EDuration="3.089682554s" podCreationTimestamp="2025-12-17 09:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:16.083320088 +0000 UTC m=+1295.743160851" watchObservedRunningTime="2025-12-17 09:26:16.089682554 +0000 UTC m=+1295.749523317"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.342454 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.830098 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xbvz5"]
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.831794 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.836982 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.837365 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.841226 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs784\" (UniqueName: \"kubernetes.io/projected/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-kube-api-access-xs784\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.841479 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-scripts\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.841608 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-config-data\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.841929 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.847102 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xbvz5"]
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.945642 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-scripts\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.945721 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-config-data\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.945854 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.945918 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs784\" (UniqueName: \"kubernetes.io/projected/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-kube-api-access-xs784\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.954566 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-scripts\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.961021 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.967340 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-config-data\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:20 crc kubenswrapper[4935]: I1217 09:26:20.981136 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs784\" (UniqueName: \"kubernetes.io/projected/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-kube-api-access-xs784\") pod \"nova-cell0-cell-mapping-xbvz5\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.116740 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.121470 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.127735 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.203766 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xbvz5"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.204724 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.204824 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2e0ef5-7511-4b15-a6ee-2102084a5175-logs\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.204875 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88dnv\" (UniqueName: \"kubernetes.io/projected/0c2e0ef5-7511-4b15-a6ee-2102084a5175-kube-api-access-88dnv\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.204945 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-config-data\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.228753 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.291599 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.293514 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.302395 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.305657 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.307598 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.309289 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.309354 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2e0ef5-7511-4b15-a6ee-2102084a5175-logs\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.309408 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88dnv\" (UniqueName: \"kubernetes.io/projected/0c2e0ef5-7511-4b15-a6ee-2102084a5175-kube-api-access-88dnv\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.309506 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-config-data\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.311904 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2e0ef5-7511-4b15-a6ee-2102084a5175-logs\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.320881 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-config-data\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.320977 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.322773 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.330399 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.369255 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88dnv\" (UniqueName: \"kubernetes.io/projected/0c2e0ef5-7511-4b15-a6ee-2102084a5175-kube-api-access-88dnv\") pod \"nova-api-0\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") " pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.381921 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.413747 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-config-data\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.413795 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.413861 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wpsc\" (UniqueName: \"kubernetes.io/projected/46c11c10-4947-468c-b8d7-26ad69885f7e-kube-api-access-6wpsc\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.413912 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-config-data\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.413933 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk2n5\" (UniqueName: \"kubernetes.io/projected/f53c7404-83eb-440f-8007-2383fff771c1-kube-api-access-hk2n5\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.413967 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c11c10-4947-468c-b8d7-26ad69885f7e-logs\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.413996 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.418943 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-f6dp6"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.422776 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.442728 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.449866 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.453107 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.461202 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.480646 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-f6dp6"]
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.516857 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.518485 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519003 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519062 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-config-data\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519092 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519151 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wpsc\" (UniqueName: \"kubernetes.io/projected/46c11c10-4947-468c-b8d7-26ad69885f7e-kube-api-access-6wpsc\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519179 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519257 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-config\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519381 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-config-data\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519409 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk2n5\" (UniqueName: \"kubernetes.io/projected/f53c7404-83eb-440f-8007-2383fff771c1-kube-api-access-hk2n5\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519497 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2vhs\" (UniqueName: \"kubernetes.io/projected/88cf002b-f718-4ada-9094-0484202077cb-kube-api-access-x2vhs\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519553 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519597 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c11c10-4947-468c-b8d7-26ad69885f7e-logs\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519641 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6"
Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519680 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqkq\" (UniqueName:
\"kubernetes.io/projected/dbff5cff-ec58-46d1-b09f-da461bc11b44-kube-api-access-zkqkq\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519717 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.519742 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.521006 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c11c10-4947-468c-b8d7-26ad69885f7e-logs\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.524483 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-config-data\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.525303 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " 
pod="openstack/nova-scheduler-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.527969 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-config-data\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.535989 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.539679 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wpsc\" (UniqueName: \"kubernetes.io/projected/46c11c10-4947-468c-b8d7-26ad69885f7e-kube-api-access-6wpsc\") pod \"nova-metadata-0\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " pod="openstack/nova-metadata-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.543977 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk2n5\" (UniqueName: \"kubernetes.io/projected/f53c7404-83eb-440f-8007-2383fff771c1-kube-api-access-hk2n5\") pod \"nova-scheduler-0\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624581 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqkq\" (UniqueName: \"kubernetes.io/projected/dbff5cff-ec58-46d1-b09f-da461bc11b44-kube-api-access-zkqkq\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624637 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624709 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624732 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624777 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624818 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-config\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624848 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2vhs\" (UniqueName: 
\"kubernetes.io/projected/88cf002b-f718-4ada-9094-0484202077cb-kube-api-access-x2vhs\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624874 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.624901 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.626489 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-config\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.627124 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.627255 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" 
(UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.634382 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.635733 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.635977 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.641015 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.643207 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqkq\" (UniqueName: \"kubernetes.io/projected/dbff5cff-ec58-46d1-b09f-da461bc11b44-kube-api-access-zkqkq\") pod \"nova-cell1-novncproxy-0\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 
09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.649320 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2vhs\" (UniqueName: \"kubernetes.io/projected/88cf002b-f718-4ada-9094-0484202077cb-kube-api-access-x2vhs\") pod \"dnsmasq-dns-557bbc7df7-f6dp6\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.727534 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.748948 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.765614 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.800187 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:21 crc kubenswrapper[4935]: I1217 09:26:21.939487 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xbvz5"] Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.125065 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:26:22 crc kubenswrapper[4935]: W1217 09:26:22.173063 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c2e0ef5_7511_4b15_a6ee_2102084a5175.slice/crio-f8c544715a3d0d35c29a314bb8783ca9bd38158a7ac0ccd37b842bfdd23f739d WatchSource:0}: Error finding container f8c544715a3d0d35c29a314bb8783ca9bd38158a7ac0ccd37b842bfdd23f739d: Status 404 returned error can't find the container with id f8c544715a3d0d35c29a314bb8783ca9bd38158a7ac0ccd37b842bfdd23f739d Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.179116 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xbvz5" event={"ID":"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06","Type":"ContainerStarted","Data":"296d4523f8cfabffefddc6a1b6f56b1e02168e289ea1e05a301217e2594478df"} Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.218646 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb85g"] Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.220136 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.227491 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.228114 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.240705 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb85g"] Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.342746 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-config-data\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.342800 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.342938 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-scripts\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.342964 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-v774l\" (UniqueName: \"kubernetes.io/projected/6288f109-f80f-4cd2-a928-914d30835d20-kube-api-access-v774l\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.349554 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:26:22 crc kubenswrapper[4935]: W1217 09:26:22.382447 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf53c7404_83eb_440f_8007_2383fff771c1.slice/crio-5e8f7aaf6b27f1dac54284b6160ab3f84d7ccda0b8e5868bc2931aef18ffeb38 WatchSource:0}: Error finding container 5e8f7aaf6b27f1dac54284b6160ab3f84d7ccda0b8e5868bc2931aef18ffeb38: Status 404 returned error can't find the container with id 5e8f7aaf6b27f1dac54284b6160ab3f84d7ccda0b8e5868bc2931aef18ffeb38 Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.386559 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:22 crc kubenswrapper[4935]: W1217 09:26:22.389381 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46c11c10_4947_468c_b8d7_26ad69885f7e.slice/crio-3036759e024ba3b74d3ba91f3379886100932c2bdfd7b363566b073fa2ddbf58 WatchSource:0}: Error finding container 3036759e024ba3b74d3ba91f3379886100932c2bdfd7b363566b073fa2ddbf58: Status 404 returned error can't find the container with id 3036759e024ba3b74d3ba91f3379886100932c2bdfd7b363566b073fa2ddbf58 Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.445080 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-scripts\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " 
pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.445137 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v774l\" (UniqueName: \"kubernetes.io/projected/6288f109-f80f-4cd2-a928-914d30835d20-kube-api-access-v774l\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.445225 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-config-data\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.445257 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.447904 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.448225 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.453315 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-scripts\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " 
pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.453569 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.461734 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-config-data\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.469843 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v774l\" (UniqueName: \"kubernetes.io/projected/6288f109-f80f-4cd2-a928-914d30835d20-kube-api-access-v774l\") pod \"nova-cell1-conductor-db-sync-xb85g\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.502662 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.515094 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.523413 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-f6dp6"] Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.564380 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:22 crc kubenswrapper[4935]: I1217 09:26:22.654056 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.098457 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb85g"] Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.228768 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f53c7404-83eb-440f-8007-2383fff771c1","Type":"ContainerStarted","Data":"5e8f7aaf6b27f1dac54284b6160ab3f84d7ccda0b8e5868bc2931aef18ffeb38"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.245836 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46c11c10-4947-468c-b8d7-26ad69885f7e","Type":"ContainerStarted","Data":"3036759e024ba3b74d3ba91f3379886100932c2bdfd7b363566b073fa2ddbf58"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.253070 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c2e0ef5-7511-4b15-a6ee-2102084a5175","Type":"ContainerStarted","Data":"f8c544715a3d0d35c29a314bb8783ca9bd38158a7ac0ccd37b842bfdd23f739d"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.255318 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb85g" event={"ID":"6288f109-f80f-4cd2-a928-914d30835d20","Type":"ContainerStarted","Data":"06b45fe5fb7cff5d72c49d3fd5016e74591c646a6e1bfb781e1c841c06161fcb"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.257760 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xbvz5" event={"ID":"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06","Type":"ContainerStarted","Data":"94a2119c0b9f91dc3fdd481b82c2852972ff9c3e055a8ccd82efc3f67a18b281"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 
09:26:23.283252 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbff5cff-ec58-46d1-b09f-da461bc11b44","Type":"ContainerStarted","Data":"97421659de938e359a6c58469d5a90029a89cccac6b72fb8ba1e01825a22491c"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.288409 4935 generic.go:334] "Generic (PLEG): container finished" podID="88cf002b-f718-4ada-9094-0484202077cb" containerID="e47d085119a866b8185db346dc6ad1dfc007295885ef5bdc9435a114646ba4ce" exitCode=0 Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.288469 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" event={"ID":"88cf002b-f718-4ada-9094-0484202077cb","Type":"ContainerDied","Data":"e47d085119a866b8185db346dc6ad1dfc007295885ef5bdc9435a114646ba4ce"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.288541 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" event={"ID":"88cf002b-f718-4ada-9094-0484202077cb","Type":"ContainerStarted","Data":"33265885f0c888f0d2dbf2527d31706b1a58a65ac8c6fb2ed85ad83dd840e84f"} Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.289083 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.289768 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.296171 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xbvz5" podStartSLOduration=3.296153114 podStartE2EDuration="3.296153114s" podCreationTimestamp="2025-12-17 09:26:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:23.280180617 +0000 UTC m=+1302.940021390" 
watchObservedRunningTime="2025-12-17 09:26:23.296153114 +0000 UTC m=+1302.955993877" Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.682563 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.683126 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.725604 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:23 crc kubenswrapper[4935]: I1217 09:26:23.748745 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:24 crc kubenswrapper[4935]: I1217 09:26:24.303555 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" event={"ID":"88cf002b-f718-4ada-9094-0484202077cb","Type":"ContainerStarted","Data":"ca68bdfd42610ebd1d2678e1de14a6857034a15a6097a7cb31f9c7d0e02237cc"} Dec 17 09:26:24 crc kubenswrapper[4935]: I1217 09:26:24.304098 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:24 crc kubenswrapper[4935]: I1217 09:26:24.308960 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb85g" event={"ID":"6288f109-f80f-4cd2-a928-914d30835d20","Type":"ContainerStarted","Data":"324a498ea9a803deea163bfc5378483bbb68558bc64874d4b0ec42f82ff2b312"} Dec 17 09:26:24 crc kubenswrapper[4935]: I1217 09:26:24.309958 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:24 crc kubenswrapper[4935]: I1217 09:26:24.310016 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 17 
09:26:24 crc kubenswrapper[4935]: I1217 09:26:24.342657 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" podStartSLOduration=3.342620293 podStartE2EDuration="3.342620293s" podCreationTimestamp="2025-12-17 09:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:24.330622593 +0000 UTC m=+1303.990463356" watchObservedRunningTime="2025-12-17 09:26:24.342620293 +0000 UTC m=+1304.002461046" Dec 17 09:26:24 crc kubenswrapper[4935]: I1217 09:26:24.367526 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xb85g" podStartSLOduration=2.367494626 podStartE2EDuration="2.367494626s" podCreationTimestamp="2025-12-17 09:26:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:24.355381552 +0000 UTC m=+1304.015222325" watchObservedRunningTime="2025-12-17 09:26:24.367494626 +0000 UTC m=+1304.027335389" Dec 17 09:26:25 crc kubenswrapper[4935]: I1217 09:26:25.224367 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 17 09:26:25 crc kubenswrapper[4935]: I1217 09:26:25.231805 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:25 crc kubenswrapper[4935]: I1217 09:26:25.323462 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:26:25 crc kubenswrapper[4935]: I1217 09:26:25.323509 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:26:25 crc kubenswrapper[4935]: I1217 09:26:25.437962 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 17 09:26:25 crc kubenswrapper[4935]: I1217 09:26:25.442215 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 17 09:26:26 crc kubenswrapper[4935]: I1217 09:26:26.738510 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:26 crc kubenswrapper[4935]: I1217 09:26:26.739613 4935 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 17 09:26:26 crc kubenswrapper[4935]: I1217 09:26:26.773887 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.360300 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c2e0ef5-7511-4b15-a6ee-2102084a5175","Type":"ContainerStarted","Data":"7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3"} Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.360376 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c2e0ef5-7511-4b15-a6ee-2102084a5175","Type":"ContainerStarted","Data":"a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9"} Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.368989 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbff5cff-ec58-46d1-b09f-da461bc11b44","Type":"ContainerStarted","Data":"d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c"} Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.369139 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="dbff5cff-ec58-46d1-b09f-da461bc11b44" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c" gracePeriod=30 Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.372002 4935 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f53c7404-83eb-440f-8007-2383fff771c1","Type":"ContainerStarted","Data":"683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387"} Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.385677 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.287170327 podStartE2EDuration="6.385653434s" podCreationTimestamp="2025-12-17 09:26:21 +0000 UTC" firstStartedPulling="2025-12-17 09:26:22.225035348 +0000 UTC m=+1301.884876121" lastFinishedPulling="2025-12-17 09:26:26.323518465 +0000 UTC m=+1305.983359228" observedRunningTime="2025-12-17 09:26:27.38139234 +0000 UTC m=+1307.041233103" watchObservedRunningTime="2025-12-17 09:26:27.385653434 +0000 UTC m=+1307.045494197" Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.401478 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-log" containerID="cri-o://a845ffd3dd54443c55e4c08f1af205e618ab85eb14f5a0aa78817979f96c8d9a" gracePeriod=30 Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.401809 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-metadata" containerID="cri-o://047c4973b6c22a4f3e3a60d72335c1ea6461014d0892c07ae5bb8a15bb20a150" gracePeriod=30 Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.401910 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46c11c10-4947-468c-b8d7-26ad69885f7e","Type":"ContainerStarted","Data":"047c4973b6c22a4f3e3a60d72335c1ea6461014d0892c07ae5bb8a15bb20a150"} Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.401941 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"46c11c10-4947-468c-b8d7-26ad69885f7e","Type":"ContainerStarted","Data":"a845ffd3dd54443c55e4c08f1af205e618ab85eb14f5a0aa78817979f96c8d9a"} Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.463769 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.809179902 podStartE2EDuration="6.463731166s" podCreationTimestamp="2025-12-17 09:26:21 +0000 UTC" firstStartedPulling="2025-12-17 09:26:22.673290025 +0000 UTC m=+1302.333130788" lastFinishedPulling="2025-12-17 09:26:26.327841289 +0000 UTC m=+1305.987682052" observedRunningTime="2025-12-17 09:26:27.456598933 +0000 UTC m=+1307.116439686" watchObservedRunningTime="2025-12-17 09:26:27.463731166 +0000 UTC m=+1307.123571919" Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.564987 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.631448023 podStartE2EDuration="6.5649548s" podCreationTimestamp="2025-12-17 09:26:21 +0000 UTC" firstStartedPulling="2025-12-17 09:26:22.390071339 +0000 UTC m=+1302.049912102" lastFinishedPulling="2025-12-17 09:26:26.323578116 +0000 UTC m=+1305.983418879" observedRunningTime="2025-12-17 09:26:27.484310905 +0000 UTC m=+1307.144151678" watchObservedRunningTime="2025-12-17 09:26:27.5649548 +0000 UTC m=+1307.224795563" Dec 17 09:26:27 crc kubenswrapper[4935]: I1217 09:26:27.592449 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.635340257 podStartE2EDuration="6.592421505s" podCreationTimestamp="2025-12-17 09:26:21 +0000 UTC" firstStartedPulling="2025-12-17 09:26:22.393364159 +0000 UTC m=+1302.053204922" lastFinishedPulling="2025-12-17 09:26:26.350445407 +0000 UTC m=+1306.010286170" observedRunningTime="2025-12-17 09:26:27.544693959 +0000 UTC m=+1307.204534722" watchObservedRunningTime="2025-12-17 09:26:27.592421505 +0000 UTC m=+1307.252262258" Dec 
17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.412985 4935 generic.go:334] "Generic (PLEG): container finished" podID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerID="047c4973b6c22a4f3e3a60d72335c1ea6461014d0892c07ae5bb8a15bb20a150" exitCode=0 Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.413032 4935 generic.go:334] "Generic (PLEG): container finished" podID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerID="a845ffd3dd54443c55e4c08f1af205e618ab85eb14f5a0aa78817979f96c8d9a" exitCode=143 Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.413066 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46c11c10-4947-468c-b8d7-26ad69885f7e","Type":"ContainerDied","Data":"047c4973b6c22a4f3e3a60d72335c1ea6461014d0892c07ae5bb8a15bb20a150"} Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.413125 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46c11c10-4947-468c-b8d7-26ad69885f7e","Type":"ContainerDied","Data":"a845ffd3dd54443c55e4c08f1af205e618ab85eb14f5a0aa78817979f96c8d9a"} Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.856821 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.953737 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-config-data\") pod \"46c11c10-4947-468c-b8d7-26ad69885f7e\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.953807 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c11c10-4947-468c-b8d7-26ad69885f7e-logs\") pod \"46c11c10-4947-468c-b8d7-26ad69885f7e\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.953977 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wpsc\" (UniqueName: \"kubernetes.io/projected/46c11c10-4947-468c-b8d7-26ad69885f7e-kube-api-access-6wpsc\") pod \"46c11c10-4947-468c-b8d7-26ad69885f7e\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.954017 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-combined-ca-bundle\") pod \"46c11c10-4947-468c-b8d7-26ad69885f7e\" (UID: \"46c11c10-4947-468c-b8d7-26ad69885f7e\") " Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.954307 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46c11c10-4947-468c-b8d7-26ad69885f7e-logs" (OuterVolumeSpecName: "logs") pod "46c11c10-4947-468c-b8d7-26ad69885f7e" (UID: "46c11c10-4947-468c-b8d7-26ad69885f7e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.954912 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46c11c10-4947-468c-b8d7-26ad69885f7e-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:28 crc kubenswrapper[4935]: I1217 09:26:28.966576 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46c11c10-4947-468c-b8d7-26ad69885f7e-kube-api-access-6wpsc" (OuterVolumeSpecName: "kube-api-access-6wpsc") pod "46c11c10-4947-468c-b8d7-26ad69885f7e" (UID: "46c11c10-4947-468c-b8d7-26ad69885f7e"). InnerVolumeSpecName "kube-api-access-6wpsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.005598 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-config-data" (OuterVolumeSpecName: "config-data") pod "46c11c10-4947-468c-b8d7-26ad69885f7e" (UID: "46c11c10-4947-468c-b8d7-26ad69885f7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.008365 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46c11c10-4947-468c-b8d7-26ad69885f7e" (UID: "46c11c10-4947-468c-b8d7-26ad69885f7e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.056828 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wpsc\" (UniqueName: \"kubernetes.io/projected/46c11c10-4947-468c-b8d7-26ad69885f7e-kube-api-access-6wpsc\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.056872 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.056883 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46c11c10-4947-468c-b8d7-26ad69885f7e-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.430635 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46c11c10-4947-468c-b8d7-26ad69885f7e","Type":"ContainerDied","Data":"3036759e024ba3b74d3ba91f3379886100932c2bdfd7b363566b073fa2ddbf58"} Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.431401 4935 scope.go:117] "RemoveContainer" containerID="047c4973b6c22a4f3e3a60d72335c1ea6461014d0892c07ae5bb8a15bb20a150" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.431683 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.463573 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.464842 4935 scope.go:117] "RemoveContainer" containerID="a845ffd3dd54443c55e4c08f1af205e618ab85eb14f5a0aa78817979f96c8d9a" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.474496 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.510496 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:29 crc kubenswrapper[4935]: E1217 09:26:29.511196 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-log" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.511219 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-log" Dec 17 09:26:29 crc kubenswrapper[4935]: E1217 09:26:29.511255 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-metadata" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.511262 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-metadata" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.515807 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-log" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.515891 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" containerName="nova-metadata-metadata" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.517398 4935 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.521182 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.521424 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.525391 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.609196 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-config-data\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.609261 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5974a9e1-45a2-415c-991d-ccede8264837-logs\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.609328 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.609362 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.609407 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxwg\" (UniqueName: \"kubernetes.io/projected/5974a9e1-45a2-415c-991d-ccede8264837-kube-api-access-cpxwg\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.711684 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-config-data\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.712743 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5974a9e1-45a2-415c-991d-ccede8264837-logs\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.712804 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.712842 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 
09:26:29.712921 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxwg\" (UniqueName: \"kubernetes.io/projected/5974a9e1-45a2-415c-991d-ccede8264837-kube-api-access-cpxwg\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.713469 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5974a9e1-45a2-415c-991d-ccede8264837-logs\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.718628 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.719263 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-config-data\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.719554 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.734475 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxwg\" (UniqueName: 
\"kubernetes.io/projected/5974a9e1-45a2-415c-991d-ccede8264837-kube-api-access-cpxwg\") pod \"nova-metadata-0\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " pod="openstack/nova-metadata-0" Dec 17 09:26:29 crc kubenswrapper[4935]: I1217 09:26:29.840860 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:26:30 crc kubenswrapper[4935]: I1217 09:26:30.320946 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:30 crc kubenswrapper[4935]: I1217 09:26:30.443569 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5974a9e1-45a2-415c-991d-ccede8264837","Type":"ContainerStarted","Data":"ddf0954ea371671f3e9f17d64691956a8560efef77d267dca136f3729c6bcf59"} Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.136981 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46c11c10-4947-468c-b8d7-26ad69885f7e" path="/var/lib/kubelet/pods/46c11c10-4947-468c-b8d7-26ad69885f7e/volumes" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.458429 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5974a9e1-45a2-415c-991d-ccede8264837","Type":"ContainerStarted","Data":"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"} Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.458486 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5974a9e1-45a2-415c-991d-ccede8264837","Type":"ContainerStarted","Data":"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"} Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.492185 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.492162274 podStartE2EDuration="2.492162274s" podCreationTimestamp="2025-12-17 09:26:29 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:31.480516042 +0000 UTC m=+1311.140356805" watchObservedRunningTime="2025-12-17 09:26:31.492162274 +0000 UTC m=+1311.152003037" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.518681 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.518763 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.728424 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.728469 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.760757 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.768111 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.802088 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.840039 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-cbvf9"] Dec 17 09:26:31 crc kubenswrapper[4935]: I1217 09:26:31.840592 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" podUID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerName="dnsmasq-dns" containerID="cri-o://fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84" gracePeriod=10 Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 
09:26:32.460211 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.469528 4935 generic.go:334] "Generic (PLEG): container finished" podID="6288f109-f80f-4cd2-a928-914d30835d20" containerID="324a498ea9a803deea163bfc5378483bbb68558bc64874d4b0ec42f82ff2b312" exitCode=0 Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.469620 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb85g" event={"ID":"6288f109-f80f-4cd2-a928-914d30835d20","Type":"ContainerDied","Data":"324a498ea9a803deea163bfc5378483bbb68558bc64874d4b0ec42f82ff2b312"} Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.471901 4935 generic.go:334] "Generic (PLEG): container finished" podID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerID="fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84" exitCode=0 Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.471945 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" event={"ID":"7ae87ac8-5831-4714-bd2a-806b03f485aa","Type":"ContainerDied","Data":"fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84"} Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.471965 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" event={"ID":"7ae87ac8-5831-4714-bd2a-806b03f485aa","Type":"ContainerDied","Data":"50a805a55c2291130e5ebd5f37cd122970e0332f1fb47459be872f8a8e84de64"} Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.471983 4935 scope.go:117] "RemoveContainer" containerID="fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.472121 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-cbvf9" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.474540 4935 generic.go:334] "Generic (PLEG): container finished" podID="a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" containerID="94a2119c0b9f91dc3fdd481b82c2852972ff9c3e055a8ccd82efc3f67a18b281" exitCode=0 Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.474618 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xbvz5" event={"ID":"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06","Type":"ContainerDied","Data":"94a2119c0b9f91dc3fdd481b82c2852972ff9c3e055a8ccd82efc3f67a18b281"} Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.492793 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-svc\") pod \"7ae87ac8-5831-4714-bd2a-806b03f485aa\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.493111 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-config\") pod \"7ae87ac8-5831-4714-bd2a-806b03f485aa\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.493229 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-sb\") pod \"7ae87ac8-5831-4714-bd2a-806b03f485aa\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.493358 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncnr5\" (UniqueName: \"kubernetes.io/projected/7ae87ac8-5831-4714-bd2a-806b03f485aa-kube-api-access-ncnr5\") pod \"7ae87ac8-5831-4714-bd2a-806b03f485aa\" (UID: 
\"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.493426 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-swift-storage-0\") pod \"7ae87ac8-5831-4714-bd2a-806b03f485aa\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.493510 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-nb\") pod \"7ae87ac8-5831-4714-bd2a-806b03f485aa\" (UID: \"7ae87ac8-5831-4714-bd2a-806b03f485aa\") " Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.528721 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ae87ac8-5831-4714-bd2a-806b03f485aa-kube-api-access-ncnr5" (OuterVolumeSpecName: "kube-api-access-ncnr5") pod "7ae87ac8-5831-4714-bd2a-806b03f485aa" (UID: "7ae87ac8-5831-4714-bd2a-806b03f485aa"). InnerVolumeSpecName "kube-api-access-ncnr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.545369 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.554214 4935 scope.go:117] "RemoveContainer" containerID="f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.603609 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncnr5\" (UniqueName: \"kubernetes.io/projected/7ae87ac8-5831-4714-bd2a-806b03f485aa-kube-api-access-ncnr5\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.612619 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.612726 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.622181 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ae87ac8-5831-4714-bd2a-806b03f485aa" (UID: "7ae87ac8-5831-4714-bd2a-806b03f485aa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.623776 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7ae87ac8-5831-4714-bd2a-806b03f485aa" (UID: "7ae87ac8-5831-4714-bd2a-806b03f485aa"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.627123 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-config" (OuterVolumeSpecName: "config") pod "7ae87ac8-5831-4714-bd2a-806b03f485aa" (UID: "7ae87ac8-5831-4714-bd2a-806b03f485aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.628217 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ae87ac8-5831-4714-bd2a-806b03f485aa" (UID: "7ae87ac8-5831-4714-bd2a-806b03f485aa"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.664040 4935 scope.go:117] "RemoveContainer" containerID="fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84" Dec 17 09:26:32 crc kubenswrapper[4935]: E1217 09:26:32.664762 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84\": container with ID starting with fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84 not found: ID does not exist" containerID="fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.664806 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84"} err="failed to get container status \"fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84\": rpc error: code = NotFound desc = could not find container \"fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84\": container with ID starting with fa3d5d5bfca60161618bec79eb98a89d0813b8578326cd1a1a2e5abbabddfc84 not found: ID does not exist" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.664834 4935 scope.go:117] "RemoveContainer" containerID="f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3" Dec 17 09:26:32 crc kubenswrapper[4935]: E1217 09:26:32.665895 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3\": container with ID starting with f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3 not found: ID does not exist" containerID="f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.665985 
4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3"} err="failed to get container status \"f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3\": rpc error: code = NotFound desc = could not find container \"f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3\": container with ID starting with f7d75a772257d5f60a06fafd4ef6ea0f7be20edf9392f037cc5b8beab2b926a3 not found: ID does not exist" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.675574 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ae87ac8-5831-4714-bd2a-806b03f485aa" (UID: "7ae87ac8-5831-4714-bd2a-806b03f485aa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.706118 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.706188 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.706200 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.706234 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.706243 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ae87ac8-5831-4714-bd2a-806b03f485aa-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.865712 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-cbvf9"] Dec 17 09:26:32 crc kubenswrapper[4935]: I1217 09:26:32.875407 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-cbvf9"] Dec 17 09:26:33 crc kubenswrapper[4935]: I1217 09:26:33.141003 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ae87ac8-5831-4714-bd2a-806b03f485aa" path="/var/lib/kubelet/pods/7ae87ac8-5831-4714-bd2a-806b03f485aa/volumes" Dec 17 09:26:33 crc kubenswrapper[4935]: I1217 09:26:33.977174 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xbvz5" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:33.999370 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.049696 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-config-data\") pod \"6288f109-f80f-4cd2-a928-914d30835d20\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.049768 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-combined-ca-bundle\") pod \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.049893 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs784\" (UniqueName: \"kubernetes.io/projected/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-kube-api-access-xs784\") pod \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.049969 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v774l\" (UniqueName: \"kubernetes.io/projected/6288f109-f80f-4cd2-a928-914d30835d20-kube-api-access-v774l\") pod \"6288f109-f80f-4cd2-a928-914d30835d20\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.049997 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-scripts\") pod \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.050029 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-scripts\") pod \"6288f109-f80f-4cd2-a928-914d30835d20\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.050161 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-config-data\") pod \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\" (UID: \"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.050217 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-combined-ca-bundle\") pod \"6288f109-f80f-4cd2-a928-914d30835d20\" (UID: \"6288f109-f80f-4cd2-a928-914d30835d20\") " Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.064506 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-scripts" (OuterVolumeSpecName: "scripts") pod "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" (UID: "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.071452 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-kube-api-access-xs784" (OuterVolumeSpecName: "kube-api-access-xs784") pod "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" (UID: "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06"). InnerVolumeSpecName "kube-api-access-xs784". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.085757 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6288f109-f80f-4cd2-a928-914d30835d20-kube-api-access-v774l" (OuterVolumeSpecName: "kube-api-access-v774l") pod "6288f109-f80f-4cd2-a928-914d30835d20" (UID: "6288f109-f80f-4cd2-a928-914d30835d20"). InnerVolumeSpecName "kube-api-access-v774l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.094096 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-scripts" (OuterVolumeSpecName: "scripts") pod "6288f109-f80f-4cd2-a928-914d30835d20" (UID: "6288f109-f80f-4cd2-a928-914d30835d20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.097148 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6288f109-f80f-4cd2-a928-914d30835d20" (UID: "6288f109-f80f-4cd2-a928-914d30835d20"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.114790 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-config-data" (OuterVolumeSpecName: "config-data") pod "6288f109-f80f-4cd2-a928-914d30835d20" (UID: "6288f109-f80f-4cd2-a928-914d30835d20"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.116717 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-config-data" (OuterVolumeSpecName: "config-data") pod "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" (UID: "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.128507 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" (UID: "a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.154797 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.155196 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.155255 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.155292 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 
09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.155306 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs784\" (UniqueName: \"kubernetes.io/projected/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-kube-api-access-xs784\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.155321 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v774l\" (UniqueName: \"kubernetes.io/projected/6288f109-f80f-4cd2-a928-914d30835d20-kube-api-access-v774l\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.155332 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.155345 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6288f109-f80f-4cd2-a928-914d30835d20-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.499770 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xb85g" event={"ID":"6288f109-f80f-4cd2-a928-914d30835d20","Type":"ContainerDied","Data":"06b45fe5fb7cff5d72c49d3fd5016e74591c646a6e1bfb781e1c841c06161fcb"} Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.499833 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06b45fe5fb7cff5d72c49d3fd5016e74591c646a6e1bfb781e1c841c06161fcb" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.499934 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xb85g" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.504917 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xbvz5" event={"ID":"a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06","Type":"ContainerDied","Data":"296d4523f8cfabffefddc6a1b6f56b1e02168e289ea1e05a301217e2594478df"} Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.504982 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="296d4523f8cfabffefddc6a1b6f56b1e02168e289ea1e05a301217e2594478df" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.505095 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xbvz5" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.628592 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 17 09:26:34 crc kubenswrapper[4935]: E1217 09:26:34.629178 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerName="dnsmasq-dns" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.629197 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerName="dnsmasq-dns" Dec 17 09:26:34 crc kubenswrapper[4935]: E1217 09:26:34.629212 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6288f109-f80f-4cd2-a928-914d30835d20" containerName="nova-cell1-conductor-db-sync" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.629220 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6288f109-f80f-4cd2-a928-914d30835d20" containerName="nova-cell1-conductor-db-sync" Dec 17 09:26:34 crc kubenswrapper[4935]: E1217 09:26:34.629235 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" containerName="nova-manage" Dec 17 09:26:34 crc 
kubenswrapper[4935]: I1217 09:26:34.629244 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" containerName="nova-manage" Dec 17 09:26:34 crc kubenswrapper[4935]: E1217 09:26:34.629336 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerName="init" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.629345 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerName="init" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.629560 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6288f109-f80f-4cd2-a928-914d30835d20" containerName="nova-cell1-conductor-db-sync" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.629588 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ae87ac8-5831-4714-bd2a-806b03f485aa" containerName="dnsmasq-dns" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.629611 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" containerName="nova-manage" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.630662 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.641357 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.642073 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.667350 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e0cb55-6351-4230-bfd7-e89a1439df97-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.667405 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqbq\" (UniqueName: \"kubernetes.io/projected/84e0cb55-6351-4230-bfd7-e89a1439df97-kube-api-access-xxqbq\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.667496 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e0cb55-6351-4230-bfd7-e89a1439df97-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.769439 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e0cb55-6351-4230-bfd7-e89a1439df97-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc 
kubenswrapper[4935]: I1217 09:26:34.769843 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e0cb55-6351-4230-bfd7-e89a1439df97-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.769893 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqbq\" (UniqueName: \"kubernetes.io/projected/84e0cb55-6351-4230-bfd7-e89a1439df97-kube-api-access-xxqbq\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.783215 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84e0cb55-6351-4230-bfd7-e89a1439df97-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.783303 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84e0cb55-6351-4230-bfd7-e89a1439df97-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.787056 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqbq\" (UniqueName: \"kubernetes.io/projected/84e0cb55-6351-4230-bfd7-e89a1439df97-kube-api-access-xxqbq\") pod \"nova-cell1-conductor-0\" (UID: \"84e0cb55-6351-4230-bfd7-e89a1439df97\") " pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.829782 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.830111 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-log" containerID="cri-o://a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9" gracePeriod=30 Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.830195 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-api" containerID="cri-o://7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3" gracePeriod=30 Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.841188 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.843435 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.893047 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.893594 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f53c7404-83eb-440f-8007-2383fff771c1" containerName="nova-scheduler-scheduler" containerID="cri-o://683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387" gracePeriod=30 Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.905223 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:26:34 crc kubenswrapper[4935]: I1217 09:26:34.952655 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:35 crc kubenswrapper[4935]: I1217 09:26:35.451839 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 17 09:26:35 crc kubenswrapper[4935]: W1217 09:26:35.462799 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84e0cb55_6351_4230_bfd7_e89a1439df97.slice/crio-724d8189eec4cecb0ad3cc128f5ebb4c8b45bb8b975431e974491bd31ec3304e WatchSource:0}: Error finding container 724d8189eec4cecb0ad3cc128f5ebb4c8b45bb8b975431e974491bd31ec3304e: Status 404 returned error can't find the container with id 724d8189eec4cecb0ad3cc128f5ebb4c8b45bb8b975431e974491bd31ec3304e Dec 17 09:26:35 crc kubenswrapper[4935]: I1217 09:26:35.516321 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"84e0cb55-6351-4230-bfd7-e89a1439df97","Type":"ContainerStarted","Data":"724d8189eec4cecb0ad3cc128f5ebb4c8b45bb8b975431e974491bd31ec3304e"} Dec 17 09:26:35 crc kubenswrapper[4935]: I1217 09:26:35.518490 4935 generic.go:334] "Generic (PLEG): container finished" podID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerID="a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9" exitCode=143 Dec 17 09:26:35 crc kubenswrapper[4935]: I1217 09:26:35.518569 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c2e0ef5-7511-4b15-a6ee-2102084a5175","Type":"ContainerDied","Data":"a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9"} Dec 17 09:26:36 crc kubenswrapper[4935]: I1217 09:26:36.529620 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"84e0cb55-6351-4230-bfd7-e89a1439df97","Type":"ContainerStarted","Data":"60a1bd45b5cac47c9904114ff9f140df79644099e24b1f7978daa3b3e973e1df"} Dec 17 09:26:36 crc kubenswrapper[4935]: I1217 09:26:36.529675 
4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-log" containerID="cri-o://233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d" gracePeriod=30 Dec 17 09:26:36 crc kubenswrapper[4935]: I1217 09:26:36.530312 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-metadata" containerID="cri-o://d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61" gracePeriod=30 Dec 17 09:26:36 crc kubenswrapper[4935]: I1217 09:26:36.553987 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.553957644 podStartE2EDuration="2.553957644s" podCreationTimestamp="2025-12-17 09:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:36.551246619 +0000 UTC m=+1316.211087382" watchObservedRunningTime="2025-12-17 09:26:36.553957644 +0000 UTC m=+1316.213798407" Dec 17 09:26:36 crc kubenswrapper[4935]: E1217 09:26:36.730510 4935 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 17 09:26:36 crc kubenswrapper[4935]: E1217 09:26:36.732518 4935 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 
17 09:26:36 crc kubenswrapper[4935]: E1217 09:26:36.733988 4935 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 17 09:26:36 crc kubenswrapper[4935]: E1217 09:26:36.734038 4935 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f53c7404-83eb-440f-8007-2383fff771c1" containerName="nova-scheduler-scheduler" Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.104088 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.127777 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpxwg\" (UniqueName: \"kubernetes.io/projected/5974a9e1-45a2-415c-991d-ccede8264837-kube-api-access-cpxwg\") pod \"5974a9e1-45a2-415c-991d-ccede8264837\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.127927 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5974a9e1-45a2-415c-991d-ccede8264837-logs\") pod \"5974a9e1-45a2-415c-991d-ccede8264837\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.127989 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-config-data\") pod \"5974a9e1-45a2-415c-991d-ccede8264837\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") " Dec 17 
09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.128134 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-nova-metadata-tls-certs\") pod \"5974a9e1-45a2-415c-991d-ccede8264837\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") "
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.128237 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-combined-ca-bundle\") pod \"5974a9e1-45a2-415c-991d-ccede8264837\" (UID: \"5974a9e1-45a2-415c-991d-ccede8264837\") "
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.128802 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5974a9e1-45a2-415c-991d-ccede8264837-logs" (OuterVolumeSpecName: "logs") pod "5974a9e1-45a2-415c-991d-ccede8264837" (UID: "5974a9e1-45a2-415c-991d-ccede8264837"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.129167 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5974a9e1-45a2-415c-991d-ccede8264837-logs\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.150525 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5974a9e1-45a2-415c-991d-ccede8264837-kube-api-access-cpxwg" (OuterVolumeSpecName: "kube-api-access-cpxwg") pod "5974a9e1-45a2-415c-991d-ccede8264837" (UID: "5974a9e1-45a2-415c-991d-ccede8264837"). InnerVolumeSpecName "kube-api-access-cpxwg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.174913 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5974a9e1-45a2-415c-991d-ccede8264837" (UID: "5974a9e1-45a2-415c-991d-ccede8264837"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.186402 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-config-data" (OuterVolumeSpecName: "config-data") pod "5974a9e1-45a2-415c-991d-ccede8264837" (UID: "5974a9e1-45a2-415c-991d-ccede8264837"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.228856 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5974a9e1-45a2-415c-991d-ccede8264837" (UID: "5974a9e1-45a2-415c-991d-ccede8264837"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.232107 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.232148 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpxwg\" (UniqueName: \"kubernetes.io/projected/5974a9e1-45a2-415c-991d-ccede8264837-kube-api-access-cpxwg\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.232163 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.232172 4935 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5974a9e1-45a2-415c-991d-ccede8264837-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.543923 4935 generic.go:334] "Generic (PLEG): container finished" podID="5974a9e1-45a2-415c-991d-ccede8264837" containerID="d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61" exitCode=0
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.543978 4935 generic.go:334] "Generic (PLEG): container finished" podID="5974a9e1-45a2-415c-991d-ccede8264837" containerID="233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d" exitCode=143
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.544041 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.544123 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5974a9e1-45a2-415c-991d-ccede8264837","Type":"ContainerDied","Data":"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"}
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.544168 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5974a9e1-45a2-415c-991d-ccede8264837","Type":"ContainerDied","Data":"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"}
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.544183 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5974a9e1-45a2-415c-991d-ccede8264837","Type":"ContainerDied","Data":"ddf0954ea371671f3e9f17d64691956a8560efef77d267dca136f3729c6bcf59"}
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.544207 4935 scope.go:117] "RemoveContainer" containerID="d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.545028 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.579709 4935 scope.go:117] "RemoveContainer" containerID="233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.590508 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.603732 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.606553 4935 scope.go:117] "RemoveContainer" containerID="d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"
Dec 17 09:26:37 crc kubenswrapper[4935]: E1217 09:26:37.607310 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61\": container with ID starting with d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61 not found: ID does not exist" containerID="d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.607373 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"} err="failed to get container status \"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61\": rpc error: code = NotFound desc = could not find container \"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61\": container with ID starting with d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61 not found: ID does not exist"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.607412 4935 scope.go:117] "RemoveContainer" containerID="233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"
Dec 17 09:26:37 crc kubenswrapper[4935]: E1217 09:26:37.607792 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d\": container with ID starting with 233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d not found: ID does not exist" containerID="233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.607854 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"} err="failed to get container status \"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d\": rpc error: code = NotFound desc = could not find container \"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d\": container with ID starting with 233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d not found: ID does not exist"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.607889 4935 scope.go:117] "RemoveContainer" containerID="d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.608152 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61"} err="failed to get container status \"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61\": rpc error: code = NotFound desc = could not find container \"d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61\": container with ID starting with d9cfb6110807a3c199e423a0d358f5d1c93e29ca79d45c815e661fb334c94e61 not found: ID does not exist"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.608179 4935 scope.go:117] "RemoveContainer" containerID="233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.608454 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d"} err="failed to get container status \"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d\": rpc error: code = NotFound desc = could not find container \"233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d\": container with ID starting with 233ecdf552f8dfca670f18d2b92ba92ad36dcbd36ba6b1258e14c72dcebf848d not found: ID does not exist"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.624618 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 17 09:26:37 crc kubenswrapper[4935]: E1217 09:26:37.625212 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-log"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.625244 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-log"
Dec 17 09:26:37 crc kubenswrapper[4935]: E1217 09:26:37.625336 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-metadata"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.625351 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-metadata"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.625598 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-log"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.625625 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5974a9e1-45a2-415c-991d-ccede8264837" containerName="nova-metadata-metadata"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.627217 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.632474 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.636703 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.646621 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.747045 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.747257 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.747321 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-config-data\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.747357 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59489e01-2f78-426d-b617-be609f6538f2-logs\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.747393 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxxlp\" (UniqueName: \"kubernetes.io/projected/59489e01-2f78-426d-b617-be609f6538f2-kube-api-access-wxxlp\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.850529 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxxlp\" (UniqueName: \"kubernetes.io/projected/59489e01-2f78-426d-b617-be609f6538f2-kube-api-access-wxxlp\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.850585 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.850717 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.850756 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-config-data\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.850799 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59489e01-2f78-426d-b617-be609f6538f2-logs\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.855463 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59489e01-2f78-426d-b617-be609f6538f2-logs\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.856339 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.857343 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.857863 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-config-data\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:37 crc kubenswrapper[4935]: I1217 09:26:37.871827 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxxlp\" (UniqueName: \"kubernetes.io/projected/59489e01-2f78-426d-b617-be609f6538f2-kube-api-access-wxxlp\") pod \"nova-metadata-0\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " pod="openstack/nova-metadata-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.008737 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.434379 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.470197 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-combined-ca-bundle\") pod \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") "
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.470499 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88dnv\" (UniqueName: \"kubernetes.io/projected/0c2e0ef5-7511-4b15-a6ee-2102084a5175-kube-api-access-88dnv\") pod \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") "
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.470646 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-config-data\") pod \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") "
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.470691 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2e0ef5-7511-4b15-a6ee-2102084a5175-logs\") pod \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\" (UID: \"0c2e0ef5-7511-4b15-a6ee-2102084a5175\") "
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.478005 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c2e0ef5-7511-4b15-a6ee-2102084a5175-logs" (OuterVolumeSpecName: "logs") pod "0c2e0ef5-7511-4b15-a6ee-2102084a5175" (UID: "0c2e0ef5-7511-4b15-a6ee-2102084a5175"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.505353 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c2e0ef5-7511-4b15-a6ee-2102084a5175-kube-api-access-88dnv" (OuterVolumeSpecName: "kube-api-access-88dnv") pod "0c2e0ef5-7511-4b15-a6ee-2102084a5175" (UID: "0c2e0ef5-7511-4b15-a6ee-2102084a5175"). InnerVolumeSpecName "kube-api-access-88dnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.519156 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c2e0ef5-7511-4b15-a6ee-2102084a5175" (UID: "0c2e0ef5-7511-4b15-a6ee-2102084a5175"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.521621 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-config-data" (OuterVolumeSpecName: "config-data") pod "0c2e0ef5-7511-4b15-a6ee-2102084a5175" (UID: "0c2e0ef5-7511-4b15-a6ee-2102084a5175"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.558071 4935 generic.go:334] "Generic (PLEG): container finished" podID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerID="7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3" exitCode=0
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.559227 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c2e0ef5-7511-4b15-a6ee-2102084a5175","Type":"ContainerDied","Data":"7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3"}
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.559305 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c2e0ef5-7511-4b15-a6ee-2102084a5175","Type":"ContainerDied","Data":"f8c544715a3d0d35c29a314bb8783ca9bd38158a7ac0ccd37b842bfdd23f739d"}
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.559330 4935 scope.go:117] "RemoveContainer" containerID="7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.558213 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.578714 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.578751 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c2e0ef5-7511-4b15-a6ee-2102084a5175-logs\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.578764 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c2e0ef5-7511-4b15-a6ee-2102084a5175-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.578775 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88dnv\" (UniqueName: \"kubernetes.io/projected/0c2e0ef5-7511-4b15-a6ee-2102084a5175-kube-api-access-88dnv\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.601233 4935 scope.go:117] "RemoveContainer" containerID="a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.635745 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.657546 4935 scope.go:117] "RemoveContainer" containerID="7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3"
Dec 17 09:26:38 crc kubenswrapper[4935]: E1217 09:26:38.659820 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3\": container with ID starting with 7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3 not found: ID does not exist" containerID="7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.659906 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3"} err="failed to get container status \"7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3\": rpc error: code = NotFound desc = could not find container \"7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3\": container with ID starting with 7809edf03e8e0d0e4ef1d07e7fcc75a8c3ba9d5cc7bd8ea9f01dcbbe125bfbb3 not found: ID does not exist"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.659960 4935 scope.go:117] "RemoveContainer" containerID="a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9"
Dec 17 09:26:38 crc kubenswrapper[4935]: E1217 09:26:38.664534 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9\": container with ID starting with a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9 not found: ID does not exist" containerID="a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.664616 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9"} err="failed to get container status \"a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9\": rpc error: code = NotFound desc = could not find container \"a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9\": container with ID starting with a4b1c8262865374f108ae71bc4ffdf5ecfe339bb1dc13864b9c64865333210b9 not found: ID does not exist"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.664692 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.678402 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 17 09:26:38 crc kubenswrapper[4935]: E1217 09:26:38.678920 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-log"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.678943 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-log"
Dec 17 09:26:38 crc kubenswrapper[4935]: E1217 09:26:38.678955 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-api"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.678963 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-api"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.679154 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-api"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.679181 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" containerName="nova-api-log"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.682653 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.687676 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.694381 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.710448 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.785239 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.785385 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-config-data\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.785500 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-logs\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.786257 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnj7\" (UniqueName: \"kubernetes.io/projected/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-kube-api-access-qfnj7\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.889019 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-logs\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.889161 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnj7\" (UniqueName: \"kubernetes.io/projected/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-kube-api-access-qfnj7\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.889239 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.889418 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-config-data\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.890549 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-logs\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.897428 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-config-data\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.898727 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:38 crc kubenswrapper[4935]: I1217 09:26:38.916198 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnj7\" (UniqueName: \"kubernetes.io/projected/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-kube-api-access-qfnj7\") pod \"nova-api-0\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " pod="openstack/nova-api-0"
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.055660 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.142165 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c2e0ef5-7511-4b15-a6ee-2102084a5175" path="/var/lib/kubelet/pods/0c2e0ef5-7511-4b15-a6ee-2102084a5175/volumes"
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.143000 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5974a9e1-45a2-415c-991d-ccede8264837" path="/var/lib/kubelet/pods/5974a9e1-45a2-415c-991d-ccede8264837/volumes"
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.560188 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.607646 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59489e01-2f78-426d-b617-be609f6538f2","Type":"ContainerStarted","Data":"44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980"}
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.607728 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59489e01-2f78-426d-b617-be609f6538f2","Type":"ContainerStarted","Data":"206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d"}
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.607748 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59489e01-2f78-426d-b617-be609f6538f2","Type":"ContainerStarted","Data":"da5923fa99a2ef871a3bec99b1acc03a8e3bb6e2309751f4d04203f52a69c8e0"}
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.609787 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.611787 4935 generic.go:334] "Generic (PLEG): container finished" podID="f53c7404-83eb-440f-8007-2383fff771c1" containerID="683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387" exitCode=0
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.611868 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f53c7404-83eb-440f-8007-2383fff771c1","Type":"ContainerDied","Data":"683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387"}
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.612065 4935 scope.go:117] "RemoveContainer" containerID="683f5c56bb0ec2f457e238de44435d8b767fd3f52080670f29eb4539fafd6387"
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.617716 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5","Type":"ContainerStarted","Data":"e0b0a84400db3be185b1c683b350ffadeb1530ea5dc49a4c2ca207b0de1e6583"}
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.685179 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.6851424010000002 podStartE2EDuration="2.685142401s" podCreationTimestamp="2025-12-17 09:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:39.638654995 +0000 UTC m=+1319.298495758" watchObservedRunningTime="2025-12-17 09:26:39.685142401 +0000 UTC m=+1319.344983174"
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.709870 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk2n5\" (UniqueName: \"kubernetes.io/projected/f53c7404-83eb-440f-8007-2383fff771c1-kube-api-access-hk2n5\") pod \"f53c7404-83eb-440f-8007-2383fff771c1\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") "
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.710021 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-config-data\") pod \"f53c7404-83eb-440f-8007-2383fff771c1\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") "
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.710096 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-combined-ca-bundle\") pod \"f53c7404-83eb-440f-8007-2383fff771c1\" (UID: \"f53c7404-83eb-440f-8007-2383fff771c1\") "
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.723238 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53c7404-83eb-440f-8007-2383fff771c1-kube-api-access-hk2n5" (OuterVolumeSpecName: "kube-api-access-hk2n5") pod "f53c7404-83eb-440f-8007-2383fff771c1" (UID: "f53c7404-83eb-440f-8007-2383fff771c1"). InnerVolumeSpecName "kube-api-access-hk2n5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.753475 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f53c7404-83eb-440f-8007-2383fff771c1" (UID: "f53c7404-83eb-440f-8007-2383fff771c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.765538 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-config-data" (OuterVolumeSpecName: "config-data") pod "f53c7404-83eb-440f-8007-2383fff771c1" (UID: "f53c7404-83eb-440f-8007-2383fff771c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.812503 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk2n5\" (UniqueName: \"kubernetes.io/projected/f53c7404-83eb-440f-8007-2383fff771c1-kube-api-access-hk2n5\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.812544 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-config-data\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:39 crc kubenswrapper[4935]: I1217 09:26:39.812554 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f53c7404-83eb-440f-8007-2383fff771c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.388054 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.630421 4935 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5","Type":"ContainerStarted","Data":"f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110"} Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.631007 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5","Type":"ContainerStarted","Data":"d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc"} Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.635186 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.635317 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f53c7404-83eb-440f-8007-2383fff771c1","Type":"ContainerDied","Data":"5e8f7aaf6b27f1dac54284b6160ab3f84d7ccda0b8e5868bc2931aef18ffeb38"} Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.661440 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.661406538 podStartE2EDuration="2.661406538s" podCreationTimestamp="2025-12-17 09:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:40.656259164 +0000 UTC m=+1320.316099937" watchObservedRunningTime="2025-12-17 09:26:40.661406538 +0000 UTC m=+1320.321247301" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.706670 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.730660 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.773508 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 
09:26:40 crc kubenswrapper[4935]: E1217 09:26:40.774186 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53c7404-83eb-440f-8007-2383fff771c1" containerName="nova-scheduler-scheduler" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.774221 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53c7404-83eb-440f-8007-2383fff771c1" containerName="nova-scheduler-scheduler" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.774473 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53c7404-83eb-440f-8007-2383fff771c1" containerName="nova-scheduler-scheduler" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.775585 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.779367 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.802063 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.847702 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-config-data\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.847761 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.847882 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t746p\" (UniqueName: \"kubernetes.io/projected/44959a00-28e4-4914-8596-cf160689ff68-kube-api-access-t746p\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.950441 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-config-data\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.950488 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.950601 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t746p\" (UniqueName: \"kubernetes.io/projected/44959a00-28e4-4914-8596-cf160689ff68-kube-api-access-t746p\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.962636 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-config-data\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.964233 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:40 crc kubenswrapper[4935]: I1217 09:26:40.972056 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t746p\" (UniqueName: \"kubernetes.io/projected/44959a00-28e4-4914-8596-cf160689ff68-kube-api-access-t746p\") pod \"nova-scheduler-0\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " pod="openstack/nova-scheduler-0" Dec 17 09:26:41 crc kubenswrapper[4935]: I1217 09:26:41.104213 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:26:41 crc kubenswrapper[4935]: I1217 09:26:41.141504 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53c7404-83eb-440f-8007-2383fff771c1" path="/var/lib/kubelet/pods/f53c7404-83eb-440f-8007-2383fff771c1/volumes" Dec 17 09:26:41 crc kubenswrapper[4935]: I1217 09:26:41.597511 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:26:41 crc kubenswrapper[4935]: W1217 09:26:41.600706 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44959a00_28e4_4914_8596_cf160689ff68.slice/crio-607429fc6d1cb88556972b0a83e0a41615e76445c479ac4fffb1408aeaca95e4 WatchSource:0}: Error finding container 607429fc6d1cb88556972b0a83e0a41615e76445c479ac4fffb1408aeaca95e4: Status 404 returned error can't find the container with id 607429fc6d1cb88556972b0a83e0a41615e76445c479ac4fffb1408aeaca95e4 Dec 17 09:26:41 crc kubenswrapper[4935]: I1217 09:26:41.662485 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44959a00-28e4-4914-8596-cf160689ff68","Type":"ContainerStarted","Data":"607429fc6d1cb88556972b0a83e0a41615e76445c479ac4fffb1408aeaca95e4"} Dec 17 09:26:42 crc 
kubenswrapper[4935]: I1217 09:26:42.699950 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44959a00-28e4-4914-8596-cf160689ff68","Type":"ContainerStarted","Data":"f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233"} Dec 17 09:26:42 crc kubenswrapper[4935]: I1217 09:26:42.734014 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.733988203 podStartE2EDuration="2.733988203s" podCreationTimestamp="2025-12-17 09:26:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:26:42.724338309 +0000 UTC m=+1322.384179082" watchObservedRunningTime="2025-12-17 09:26:42.733988203 +0000 UTC m=+1322.393828966" Dec 17 09:26:43 crc kubenswrapper[4935]: I1217 09:26:43.009635 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 17 09:26:43 crc kubenswrapper[4935]: I1217 09:26:43.010829 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 17 09:26:44 crc kubenswrapper[4935]: I1217 09:26:44.742549 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:26:44 crc kubenswrapper[4935]: I1217 09:26:44.744251 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4a4755e9-435f-4abd-b10f-1f55f1bc4d17" containerName="kube-state-metrics" containerID="cri-o://b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756" gracePeriod=30 Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.044019 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.334581 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.358220 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdnfn\" (UniqueName: \"kubernetes.io/projected/4a4755e9-435f-4abd-b10f-1f55f1bc4d17-kube-api-access-zdnfn\") pod \"4a4755e9-435f-4abd-b10f-1f55f1bc4d17\" (UID: \"4a4755e9-435f-4abd-b10f-1f55f1bc4d17\") " Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.372055 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4755e9-435f-4abd-b10f-1f55f1bc4d17-kube-api-access-zdnfn" (OuterVolumeSpecName: "kube-api-access-zdnfn") pod "4a4755e9-435f-4abd-b10f-1f55f1bc4d17" (UID: "4a4755e9-435f-4abd-b10f-1f55f1bc4d17"). InnerVolumeSpecName "kube-api-access-zdnfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.461927 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdnfn\" (UniqueName: \"kubernetes.io/projected/4a4755e9-435f-4abd-b10f-1f55f1bc4d17-kube-api-access-zdnfn\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.733467 4935 generic.go:334] "Generic (PLEG): container finished" podID="4a4755e9-435f-4abd-b10f-1f55f1bc4d17" containerID="b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756" exitCode=2 Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.733538 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a4755e9-435f-4abd-b10f-1f55f1bc4d17","Type":"ContainerDied","Data":"b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756"} Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.733566 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.733597 4935 scope.go:117] "RemoveContainer" containerID="b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.733580 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4a4755e9-435f-4abd-b10f-1f55f1bc4d17","Type":"ContainerDied","Data":"c5679fd22c04640ffade78a3d738cbfa0acb864d4cc849a2f208da1233ba2e7c"} Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.803030 4935 scope.go:117] "RemoveContainer" containerID="b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756" Dec 17 09:26:45 crc kubenswrapper[4935]: E1217 09:26:45.803580 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756\": container with ID starting with b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756 not found: ID does not exist" containerID="b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.803633 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756"} err="failed to get container status \"b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756\": rpc error: code = NotFound desc = could not find container \"b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756\": container with ID starting with b4e0ab173484ceddc03ff72af5dcdd0618b3cf0073e3ebf3631be02ee2ace756 not found: ID does not exist" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.815028 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 
09:26:45.830881 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.851365 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:26:45 crc kubenswrapper[4935]: E1217 09:26:45.852027 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4755e9-435f-4abd-b10f-1f55f1bc4d17" containerName="kube-state-metrics" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.852050 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4755e9-435f-4abd-b10f-1f55f1bc4d17" containerName="kube-state-metrics" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.852246 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4755e9-435f-4abd-b10f-1f55f1bc4d17" containerName="kube-state-metrics" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.853132 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.856351 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.858213 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.858460 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.885838 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 
09:26:45.886413 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st5m6\" (UniqueName: \"kubernetes.io/projected/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-api-access-st5m6\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.886490 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.886516 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.987190 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.987288 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st5m6\" (UniqueName: \"kubernetes.io/projected/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-api-access-st5m6\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc 
kubenswrapper[4935]: I1217 09:26:45.987353 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.987375 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.992395 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.992675 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:45 crc kubenswrapper[4935]: I1217 09:26:45.992683 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:46 crc kubenswrapper[4935]: I1217 09:26:46.011544 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st5m6\" (UniqueName: \"kubernetes.io/projected/e33889bd-e62d-4b7c-83b1-a2ffc878b85a-kube-api-access-st5m6\") pod \"kube-state-metrics-0\" (UID: \"e33889bd-e62d-4b7c-83b1-a2ffc878b85a\") " pod="openstack/kube-state-metrics-0" Dec 17 09:26:46 crc kubenswrapper[4935]: I1217 09:26:46.106365 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 17 09:26:46 crc kubenswrapper[4935]: I1217 09:26:46.175454 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 17 09:26:46 crc kubenswrapper[4935]: I1217 09:26:46.685200 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 17 09:26:46 crc kubenswrapper[4935]: I1217 09:26:46.748861 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e33889bd-e62d-4b7c-83b1-a2ffc878b85a","Type":"ContainerStarted","Data":"c71755eac8fd49991fbcddbf0c8095b778380a1d067fc615b008c539c3a3abff"} Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.058689 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.059112 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-central-agent" containerID="cri-o://355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0" gracePeriod=30 Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.059643 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-notification-agent" containerID="cri-o://a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee" gracePeriod=30 Dec 17 09:26:47 crc 
kubenswrapper[4935]: I1217 09:26:47.059763 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="sg-core" containerID="cri-o://f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47" gracePeriod=30 Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.060661 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="proxy-httpd" containerID="cri-o://147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7" gracePeriod=30 Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.137615 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4755e9-435f-4abd-b10f-1f55f1bc4d17" path="/var/lib/kubelet/pods/4a4755e9-435f-4abd-b10f-1f55f1bc4d17/volumes" Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.760361 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e33889bd-e62d-4b7c-83b1-a2ffc878b85a","Type":"ContainerStarted","Data":"b35dd959a7f13de13af5b693eb05a34d539da8c7f02b3b2adda1be28c86df92e"} Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.760931 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.763818 4935 generic.go:334] "Generic (PLEG): container finished" podID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerID="147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7" exitCode=0 Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.763862 4935 generic.go:334] "Generic (PLEG): container finished" podID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerID="f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47" exitCode=2 Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.763875 4935 generic.go:334] "Generic 
(PLEG): container finished" podID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerID="355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0" exitCode=0 Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.763913 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerDied","Data":"147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7"} Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.763957 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerDied","Data":"f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47"} Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.763970 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerDied","Data":"355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0"} Dec 17 09:26:47 crc kubenswrapper[4935]: I1217 09:26:47.784837 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.4366281770000002 podStartE2EDuration="2.784796117s" podCreationTimestamp="2025-12-17 09:26:45 +0000 UTC" firstStartedPulling="2025-12-17 09:26:46.671349384 +0000 UTC m=+1326.331190147" lastFinishedPulling="2025-12-17 09:26:47.019517324 +0000 UTC m=+1326.679358087" observedRunningTime="2025-12-17 09:26:47.77836029 +0000 UTC m=+1327.438201053" watchObservedRunningTime="2025-12-17 09:26:47.784796117 +0000 UTC m=+1327.444636920" Dec 17 09:26:48 crc kubenswrapper[4935]: I1217 09:26:48.009193 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 17 09:26:48 crc kubenswrapper[4935]: I1217 09:26:48.009297 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Dec 17 09:26:49 crc kubenswrapper[4935]: I1217 09:26:49.026481 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:26:49 crc kubenswrapper[4935]: I1217 09:26:49.026501 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 17 09:26:49 crc kubenswrapper[4935]: I1217 09:26:49.056522 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:26:49 crc kubenswrapper[4935]: I1217 09:26:49.056607 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.140618 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.140645 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.580582 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.738974 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-config-data\") pod \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.739193 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-sg-core-conf-yaml\") pod \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.739300 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-scripts\") pod \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.739410 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-run-httpd\") pod \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.739432 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-combined-ca-bundle\") pod \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.739483 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-log-httpd\") pod \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.739545 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lxr5\" (UniqueName: \"kubernetes.io/projected/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-kube-api-access-7lxr5\") pod \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\" (UID: \"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec\") " Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.739900 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" (UID: "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.740135 4935 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.740239 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" (UID: "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.747404 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-kube-api-access-7lxr5" (OuterVolumeSpecName: "kube-api-access-7lxr5") pod "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" (UID: "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec"). InnerVolumeSpecName "kube-api-access-7lxr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.748807 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-scripts" (OuterVolumeSpecName: "scripts") pod "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" (UID: "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.776033 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" (UID: "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.839450 4935 generic.go:334] "Generic (PLEG): container finished" podID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerID="a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee" exitCode=0 Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.839530 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerDied","Data":"a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee"} Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.839574 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ac32fe4-e9ce-41b6-a39c-de1219bed1ec","Type":"ContainerDied","Data":"5ff50076dd125443ae23cb6bdd7abffa1e8ac490448cc0ccc33b53c408437af9"} Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.839600 4935 scope.go:117] "RemoveContainer" containerID="147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.839943 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.842671 4935 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.842708 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.842722 4935 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.842737 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lxr5\" (UniqueName: \"kubernetes.io/projected/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-kube-api-access-7lxr5\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.864992 4935 scope.go:117] "RemoveContainer" containerID="f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.868171 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" (UID: "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.868891 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-config-data" (OuterVolumeSpecName: "config-data") pod "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" (UID: "5ac32fe4-e9ce-41b6-a39c-de1219bed1ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.907397 4935 scope.go:117] "RemoveContainer" containerID="a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.937726 4935 scope.go:117] "RemoveContainer" containerID="355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.944536 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.944621 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.960655 4935 scope.go:117] "RemoveContainer" containerID="147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7" Dec 17 09:26:50 crc kubenswrapper[4935]: E1217 09:26:50.966463 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7\": container with ID starting with 147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7 not found: ID does not exist" 
containerID="147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.966513 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7"} err="failed to get container status \"147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7\": rpc error: code = NotFound desc = could not find container \"147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7\": container with ID starting with 147fa03d6f42a298fb960f2ad3b4a8c8596726a0ce4e4b54fcde86c68a397ad7 not found: ID does not exist" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.966545 4935 scope.go:117] "RemoveContainer" containerID="f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47" Dec 17 09:26:50 crc kubenswrapper[4935]: E1217 09:26:50.966889 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47\": container with ID starting with f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47 not found: ID does not exist" containerID="f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.966911 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47"} err="failed to get container status \"f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47\": rpc error: code = NotFound desc = could not find container \"f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47\": container with ID starting with f01f4a73d941ba694bddcadbbcaef23af5862f75d4ed1954211e4dd121cadd47 not found: ID does not exist" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.966939 4935 scope.go:117] 
"RemoveContainer" containerID="a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee" Dec 17 09:26:50 crc kubenswrapper[4935]: E1217 09:26:50.967191 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee\": container with ID starting with a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee not found: ID does not exist" containerID="a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.967219 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee"} err="failed to get container status \"a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee\": rpc error: code = NotFound desc = could not find container \"a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee\": container with ID starting with a85a05e6a693e93a3651c1d55e0622f09a2474b28e2d6839b3d48c6a119eb5ee not found: ID does not exist" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.967238 4935 scope.go:117] "RemoveContainer" containerID="355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0" Dec 17 09:26:50 crc kubenswrapper[4935]: E1217 09:26:50.968578 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0\": container with ID starting with 355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0 not found: ID does not exist" containerID="355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0" Dec 17 09:26:50 crc kubenswrapper[4935]: I1217 09:26:50.968605 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0"} err="failed to get container status \"355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0\": rpc error: code = NotFound desc = could not find container \"355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0\": container with ID starting with 355b1df11199f2c870e2363b90d17eff94fc50e915c85d1391169c446cf841f0 not found: ID does not exist" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.106144 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.142343 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.230558 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.256101 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.283318 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:51 crc kubenswrapper[4935]: E1217 09:26:51.284487 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-notification-agent" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.284600 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-notification-agent" Dec 17 09:26:51 crc kubenswrapper[4935]: E1217 09:26:51.284708 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="sg-core" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.284784 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="sg-core" Dec 17 09:26:51 crc kubenswrapper[4935]: E1217 09:26:51.284888 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="proxy-httpd" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.284991 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="proxy-httpd" Dec 17 09:26:51 crc kubenswrapper[4935]: E1217 09:26:51.285074 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-central-agent" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.285140 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-central-agent" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.285468 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-central-agent" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.285563 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="proxy-httpd" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.285649 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="ceilometer-notification-agent" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.285742 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" containerName="sg-core" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.288699 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.291926 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.292602 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.292876 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.295026 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.455484 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.455937 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-run-httpd\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.456491 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.456619 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-scripts\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.456697 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-config-data\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.456768 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlp5x\" (UniqueName: \"kubernetes.io/projected/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-kube-api-access-tlp5x\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.456987 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.457065 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-log-httpd\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559135 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559198 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-scripts\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559232 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-config-data\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559262 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlp5x\" (UniqueName: \"kubernetes.io/projected/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-kube-api-access-tlp5x\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559307 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559328 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-log-httpd\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559379 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.559482 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-run-httpd\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.560327 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-log-httpd\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.560339 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-run-httpd\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.566185 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.566922 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 
09:26:51.570330 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-scripts\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.570832 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-config-data\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.580867 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.581664 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlp5x\" (UniqueName: \"kubernetes.io/projected/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-kube-api-access-tlp5x\") pod \"ceilometer-0\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.609192 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:26:51 crc kubenswrapper[4935]: I1217 09:26:51.897685 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 17 09:26:52 crc kubenswrapper[4935]: I1217 09:26:52.158340 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:26:52 crc kubenswrapper[4935]: W1217 09:26:52.172234 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9be18c4_27ba_4b1a_9364_33fd1d6a39e3.slice/crio-e65ff532908220c2755999296f7e3a947f1e5bc9937920903ffe210d9b4df025 WatchSource:0}: Error finding container e65ff532908220c2755999296f7e3a947f1e5bc9937920903ffe210d9b4df025: Status 404 returned error can't find the container with id e65ff532908220c2755999296f7e3a947f1e5bc9937920903ffe210d9b4df025 Dec 17 09:26:52 crc kubenswrapper[4935]: I1217 09:26:52.176409 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 09:26:52 crc kubenswrapper[4935]: I1217 09:26:52.864956 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerStarted","Data":"e65ff532908220c2755999296f7e3a947f1e5bc9937920903ffe210d9b4df025"} Dec 17 09:26:53 crc kubenswrapper[4935]: I1217 09:26:53.157961 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac32fe4-e9ce-41b6-a39c-de1219bed1ec" path="/var/lib/kubelet/pods/5ac32fe4-e9ce-41b6-a39c-de1219bed1ec/volumes" Dec 17 09:26:53 crc kubenswrapper[4935]: I1217 09:26:53.875120 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerStarted","Data":"83744f72d0b4fbb7c9be9bdfd35dbece3486ea9bc619e2a0b414c96c10152046"} Dec 17 09:26:55 crc kubenswrapper[4935]: I1217 09:26:55.928857 4935 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerStarted","Data":"d841418acfcdf57e0ccccb24a7f1a5814e68a0f7e2f3a9c612f9fa5affe977ff"} Dec 17 09:26:55 crc kubenswrapper[4935]: I1217 09:26:55.929762 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerStarted","Data":"629691029b6763fdb8c8820e662c4a91eb0f367b3fe5c0e3e241101d9bfc0426"} Dec 17 09:26:56 crc kubenswrapper[4935]: I1217 09:26:56.190880 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.785816 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.910558 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkqkq\" (UniqueName: \"kubernetes.io/projected/dbff5cff-ec58-46d1-b09f-da461bc11b44-kube-api-access-zkqkq\") pod \"dbff5cff-ec58-46d1-b09f-da461bc11b44\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.910942 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-combined-ca-bundle\") pod \"dbff5cff-ec58-46d1-b09f-da461bc11b44\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.911008 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-config-data\") pod \"dbff5cff-ec58-46d1-b09f-da461bc11b44\" (UID: \"dbff5cff-ec58-46d1-b09f-da461bc11b44\") " Dec 17 09:26:57 crc 
kubenswrapper[4935]: I1217 09:26:57.918822 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbff5cff-ec58-46d1-b09f-da461bc11b44-kube-api-access-zkqkq" (OuterVolumeSpecName: "kube-api-access-zkqkq") pod "dbff5cff-ec58-46d1-b09f-da461bc11b44" (UID: "dbff5cff-ec58-46d1-b09f-da461bc11b44"). InnerVolumeSpecName "kube-api-access-zkqkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.956927 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-config-data" (OuterVolumeSpecName: "config-data") pod "dbff5cff-ec58-46d1-b09f-da461bc11b44" (UID: "dbff5cff-ec58-46d1-b09f-da461bc11b44"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.957129 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbff5cff-ec58-46d1-b09f-da461bc11b44" (UID: "dbff5cff-ec58-46d1-b09f-da461bc11b44"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.964676 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerStarted","Data":"778a236232a09fca0d0310f59292c665b771bc397def85806e512396a5754f86"} Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.965068 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.967405 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.967583 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbff5cff-ec58-46d1-b09f-da461bc11b44","Type":"ContainerDied","Data":"d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c"} Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.967677 4935 scope.go:117] "RemoveContainer" containerID="d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c" Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.968965 4935 generic.go:334] "Generic (PLEG): container finished" podID="dbff5cff-ec58-46d1-b09f-da461bc11b44" containerID="d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c" exitCode=137 Dec 17 09:26:57 crc kubenswrapper[4935]: I1217 09:26:57.969288 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dbff5cff-ec58-46d1-b09f-da461bc11b44","Type":"ContainerDied","Data":"97421659de938e359a6c58469d5a90029a89cccac6b72fb8ba1e01825a22491c"} Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.003717 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.943596829 podStartE2EDuration="7.003689437s" podCreationTimestamp="2025-12-17 09:26:51 +0000 UTC" firstStartedPulling="2025-12-17 09:26:52.176019331 +0000 UTC m=+1331.835860084" lastFinishedPulling="2025-12-17 09:26:57.236111919 +0000 UTC m=+1336.895952692" observedRunningTime="2025-12-17 09:26:57.99140867 +0000 UTC m=+1337.651249443" watchObservedRunningTime="2025-12-17 09:26:58.003689437 +0000 UTC m=+1337.663530200" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.011806 4935 scope.go:117] "RemoveContainer" containerID="d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c" Dec 17 09:26:58 crc kubenswrapper[4935]: E1217 09:26:58.013907 4935 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c\": container with ID starting with d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c not found: ID does not exist" containerID="d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.013968 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c"} err="failed to get container status \"d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c\": rpc error: code = NotFound desc = could not find container \"d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c\": container with ID starting with d7fb3a240bf8a7c49ab342e842f52eee20e2be6af21d4b50d36bccc2e326c82c not found: ID does not exist" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.017908 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.017955 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dbff5cff-ec58-46d1-b09f-da461bc11b44-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.017969 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkqkq\" (UniqueName: \"kubernetes.io/projected/dbff5cff-ec58-46d1-b09f-da461bc11b44-kube-api-access-zkqkq\") on node \"crc\" DevicePath \"\"" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.020893 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.029813 
4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.030036 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.038862 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.050016 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 17 09:26:58 crc kubenswrapper[4935]: E1217 09:26:58.050661 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbff5cff-ec58-46d1-b09f-da461bc11b44" containerName="nova-cell1-novncproxy-novncproxy" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.050682 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbff5cff-ec58-46d1-b09f-da461bc11b44" containerName="nova-cell1-novncproxy-novncproxy" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.051063 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbff5cff-ec58-46d1-b09f-da461bc11b44" containerName="nova-cell1-novncproxy-novncproxy" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.052289 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.058075 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.058131 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.058332 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.078440 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.091696 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.121859 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.222194 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.222514 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlzlc\" (UniqueName: \"kubernetes.io/projected/b941dd61-25eb-4443-a4cf-356fbe73f67b-kube-api-access-nlzlc\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.222584 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.222984 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.223121 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.325426 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlzlc\" (UniqueName: \"kubernetes.io/projected/b941dd61-25eb-4443-a4cf-356fbe73f67b-kube-api-access-nlzlc\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.325475 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.325529 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.325567 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.325627 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.335829 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.335962 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.337110 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.345177 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b941dd61-25eb-4443-a4cf-356fbe73f67b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.349615 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlzlc\" (UniqueName: \"kubernetes.io/projected/b941dd61-25eb-4443-a4cf-356fbe73f67b-kube-api-access-nlzlc\") pod \"nova-cell1-novncproxy-0\" (UID: \"b941dd61-25eb-4443-a4cf-356fbe73f67b\") " pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:58 crc kubenswrapper[4935]: I1217 09:26:58.386846 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.063975 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.066656 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.068717 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.073788 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 17 09:26:59 crc kubenswrapper[4935]: W1217 09:26:59.079587 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb941dd61_25eb_4443_a4cf_356fbe73f67b.slice/crio-c801f30e9cec4c1e1d2309d99c32138ab90cdc48e038a79a15b1feca1f2fc62e WatchSource:0}: Error finding container c801f30e9cec4c1e1d2309d99c32138ab90cdc48e038a79a15b1feca1f2fc62e: Status 404 returned error can't find the container with id c801f30e9cec4c1e1d2309d99c32138ab90cdc48e038a79a15b1feca1f2fc62e Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.081914 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.139000 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbff5cff-ec58-46d1-b09f-da461bc11b44" path="/var/lib/kubelet/pods/dbff5cff-ec58-46d1-b09f-da461bc11b44/volumes" Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.996354 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b941dd61-25eb-4443-a4cf-356fbe73f67b","Type":"ContainerStarted","Data":"60179f3e6372286e68d509f04e475b360c350ae66451a0f941ccaeb37313081b"} Dec 17 09:26:59 crc 
kubenswrapper[4935]: I1217 09:26:59.997027 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b941dd61-25eb-4443-a4cf-356fbe73f67b","Type":"ContainerStarted","Data":"c801f30e9cec4c1e1d2309d99c32138ab90cdc48e038a79a15b1feca1f2fc62e"} Dec 17 09:26:59 crc kubenswrapper[4935]: I1217 09:26:59.997119 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.003461 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.033046 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.033015513 podStartE2EDuration="2.033015513s" podCreationTimestamp="2025-12-17 09:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:27:00.017458536 +0000 UTC m=+1339.677299319" watchObservedRunningTime="2025-12-17 09:27:00.033015513 +0000 UTC m=+1339.692856276" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.135010 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.137446 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.367096 4935 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-bgbn2"] Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.371791 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.383949 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-bgbn2"] Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.627417 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.627492 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-svc\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.627552 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rr8s\" (UniqueName: \"kubernetes.io/projected/95d6599e-4042-4043-a505-9e1ca6037bf2-kube-api-access-2rr8s\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.627583 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: 
\"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.627656 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.627863 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-config\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.729631 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.729781 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-config\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.730823 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " 
pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.730984 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-config\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.731188 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.732098 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.733082 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-svc\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.733130 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-svc\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.733219 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rr8s\" (UniqueName: \"kubernetes.io/projected/95d6599e-4042-4043-a505-9e1ca6037bf2-kube-api-access-2rr8s\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.733821 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.735010 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:00 crc kubenswrapper[4935]: I1217 09:27:00.763404 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rr8s\" (UniqueName: \"kubernetes.io/projected/95d6599e-4042-4043-a505-9e1ca6037bf2-kube-api-access-2rr8s\") pod \"dnsmasq-dns-5ddd577785-bgbn2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:01 crc kubenswrapper[4935]: I1217 09:27:01.021931 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:01 crc kubenswrapper[4935]: W1217 09:27:01.553169 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d6599e_4042_4043_a505_9e1ca6037bf2.slice/crio-ae57d86c0f132096d203b25527a449e80e887af29f2d49091d9bbf9a3b7e9acc WatchSource:0}: Error finding container ae57d86c0f132096d203b25527a449e80e887af29f2d49091d9bbf9a3b7e9acc: Status 404 returned error can't find the container with id ae57d86c0f132096d203b25527a449e80e887af29f2d49091d9bbf9a3b7e9acc Dec 17 09:27:01 crc kubenswrapper[4935]: I1217 09:27:01.564981 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-bgbn2"] Dec 17 09:27:02 crc kubenswrapper[4935]: I1217 09:27:02.037366 4935 generic.go:334] "Generic (PLEG): container finished" podID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerID="18bbb5fbed6234e6527ef3578c016b6b16ab402b161450c578a5d4a6a4e457fc" exitCode=0 Dec 17 09:27:02 crc kubenswrapper[4935]: I1217 09:27:02.038850 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" event={"ID":"95d6599e-4042-4043-a505-9e1ca6037bf2","Type":"ContainerDied","Data":"18bbb5fbed6234e6527ef3578c016b6b16ab402b161450c578a5d4a6a4e457fc"} Dec 17 09:27:02 crc kubenswrapper[4935]: I1217 09:27:02.038899 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" event={"ID":"95d6599e-4042-4043-a505-9e1ca6037bf2","Type":"ContainerStarted","Data":"ae57d86c0f132096d203b25527a449e80e887af29f2d49091d9bbf9a3b7e9acc"} Dec 17 09:27:02 crc kubenswrapper[4935]: E1217 09:27:02.069764 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d6599e_4042_4043_a505_9e1ca6037bf2.slice/crio-conmon-18bbb5fbed6234e6527ef3578c016b6b16ab402b161450c578a5d4a6a4e457fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d6599e_4042_4043_a505_9e1ca6037bf2.slice/crio-18bbb5fbed6234e6527ef3578c016b6b16ab402b161450c578a5d4a6a4e457fc.scope\": RecentStats: unable to find data in memory cache]" Dec 17 09:27:02 crc kubenswrapper[4935]: I1217 09:27:02.920408 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.065874 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" event={"ID":"95d6599e-4042-4043-a505-9e1ca6037bf2","Type":"ContainerStarted","Data":"1f8aa261dff5dd5c48b20c4e7d4ed95b443f039de1ab6dca9a2bbf7701779967"} Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.066041 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-log" containerID="cri-o://d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc" gracePeriod=30 Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.066251 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-api" containerID="cri-o://f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110" gracePeriod=30 Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.103418 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" podStartSLOduration=3.103396176 podStartE2EDuration="3.103396176s" podCreationTimestamp="2025-12-17 09:27:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:27:03.09941701 +0000 UTC m=+1342.759257793" watchObservedRunningTime="2025-12-17 09:27:03.103396176 +0000 UTC m=+1342.763236939" Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.387016 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.529690 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.530110 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-central-agent" containerID="cri-o://83744f72d0b4fbb7c9be9bdfd35dbece3486ea9bc619e2a0b414c96c10152046" gracePeriod=30 Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.530170 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="proxy-httpd" containerID="cri-o://778a236232a09fca0d0310f59292c665b771bc397def85806e512396a5754f86" gracePeriod=30 Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.530192 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="sg-core" containerID="cri-o://629691029b6763fdb8c8820e662c4a91eb0f367b3fe5c0e3e241101d9bfc0426" gracePeriod=30 Dec 17 09:27:03 crc kubenswrapper[4935]: I1217 09:27:03.530205 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-notification-agent" containerID="cri-o://d841418acfcdf57e0ccccb24a7f1a5814e68a0f7e2f3a9c612f9fa5affe977ff" gracePeriod=30 Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.087622 
4935 generic.go:334] "Generic (PLEG): container finished" podID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerID="d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc" exitCode=143 Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.087725 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5","Type":"ContainerDied","Data":"d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc"} Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093716 4935 generic.go:334] "Generic (PLEG): container finished" podID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerID="778a236232a09fca0d0310f59292c665b771bc397def85806e512396a5754f86" exitCode=0 Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093753 4935 generic.go:334] "Generic (PLEG): container finished" podID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerID="629691029b6763fdb8c8820e662c4a91eb0f367b3fe5c0e3e241101d9bfc0426" exitCode=2 Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093766 4935 generic.go:334] "Generic (PLEG): container finished" podID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerID="d841418acfcdf57e0ccccb24a7f1a5814e68a0f7e2f3a9c612f9fa5affe977ff" exitCode=0 Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093775 4935 generic.go:334] "Generic (PLEG): container finished" podID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerID="83744f72d0b4fbb7c9be9bdfd35dbece3486ea9bc619e2a0b414c96c10152046" exitCode=0 Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093796 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerDied","Data":"778a236232a09fca0d0310f59292c665b771bc397def85806e512396a5754f86"} Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093884 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerDied","Data":"629691029b6763fdb8c8820e662c4a91eb0f367b3fe5c0e3e241101d9bfc0426"} Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093909 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerDied","Data":"d841418acfcdf57e0ccccb24a7f1a5814e68a0f7e2f3a9c612f9fa5affe977ff"} Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.093930 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerDied","Data":"83744f72d0b4fbb7c9be9bdfd35dbece3486ea9bc619e2a0b414c96c10152046"} Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.094087 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.552774 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.664450 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-config-data\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.664517 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-ceilometer-tls-certs\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.664573 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-run-httpd\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.664670 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-scripts\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.664766 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-combined-ca-bundle\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.664812 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-log-httpd\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.664974 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlp5x\" (UniqueName: \"kubernetes.io/projected/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-kube-api-access-tlp5x\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.665069 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-sg-core-conf-yaml\") pod \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\" (UID: \"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3\") " Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.666195 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.666322 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.672029 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-scripts" (OuterVolumeSpecName: "scripts") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.672482 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-kube-api-access-tlp5x" (OuterVolumeSpecName: "kube-api-access-tlp5x") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "kube-api-access-tlp5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.699915 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.729677 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.751003 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.767928 4935 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.767973 4935 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.767983 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.767993 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.768005 4935 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.768017 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlp5x\" (UniqueName: 
\"kubernetes.io/projected/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-kube-api-access-tlp5x\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.768027 4935 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.790121 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-config-data" (OuterVolumeSpecName: "config-data") pod "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" (UID: "e9be18c4-27ba-4b1a-9364-33fd1d6a39e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:04 crc kubenswrapper[4935]: I1217 09:27:04.870219 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.107300 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e9be18c4-27ba-4b1a-9364-33fd1d6a39e3","Type":"ContainerDied","Data":"e65ff532908220c2755999296f7e3a947f1e5bc9937920903ffe210d9b4df025"} Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.107425 4935 scope.go:117] "RemoveContainer" containerID="778a236232a09fca0d0310f59292c665b771bc397def85806e512396a5754f86" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.107339 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.158261 4935 scope.go:117] "RemoveContainer" containerID="629691029b6763fdb8c8820e662c4a91eb0f367b3fe5c0e3e241101d9bfc0426" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.159987 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.169664 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.184433 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:05 crc kubenswrapper[4935]: E1217 09:27:05.185140 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-central-agent" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185172 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-central-agent" Dec 17 09:27:05 crc kubenswrapper[4935]: E1217 09:27:05.185182 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-notification-agent" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185192 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-notification-agent" Dec 17 09:27:05 crc kubenswrapper[4935]: E1217 09:27:05.185222 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="proxy-httpd" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185232 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="proxy-httpd" Dec 17 09:27:05 crc kubenswrapper[4935]: E1217 09:27:05.185268 4935 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="sg-core" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185297 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="sg-core" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185553 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-central-agent" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185595 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="ceilometer-notification-agent" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185611 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="sg-core" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.185626 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" containerName="proxy-httpd" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.187737 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.192673 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.193740 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.194048 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.194248 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.204693 4935 scope.go:117] "RemoveContainer" containerID="d841418acfcdf57e0ccccb24a7f1a5814e68a0f7e2f3a9c612f9fa5affe977ff" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.238321 4935 scope.go:117] "RemoveContainer" containerID="83744f72d0b4fbb7c9be9bdfd35dbece3486ea9bc619e2a0b414c96c10152046" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280391 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-run-httpd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280444 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rxd\" (UniqueName: \"kubernetes.io/projected/5d8b192d-6898-4e6a-b442-63985cf34098-kube-api-access-w8rxd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280493 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-config-data\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280541 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280557 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-log-httpd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280589 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-scripts\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280614 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.280676 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.383896 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-scripts\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.383979 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.384169 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.384239 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-run-httpd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.384290 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rxd\" (UniqueName: \"kubernetes.io/projected/5d8b192d-6898-4e6a-b442-63985cf34098-kube-api-access-w8rxd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.384344 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-config-data\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.384395 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.384413 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-log-httpd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.385020 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-log-httpd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.385019 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-run-httpd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.389440 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-scripts\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.390042 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-config-data\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.390564 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.391683 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.391844 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.408942 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rxd\" (UniqueName: \"kubernetes.io/projected/5d8b192d-6898-4e6a-b442-63985cf34098-kube-api-access-w8rxd\") pod \"ceilometer-0\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.510036 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:05 crc kubenswrapper[4935]: I1217 09:27:05.826634 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.019995 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:06 crc kubenswrapper[4935]: W1217 09:27:06.028423 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d8b192d_6898_4e6a_b442_63985cf34098.slice/crio-469cc48d80d25b62be8c729d1669e662c6462c18c298116c9a12c245b824f3a9 WatchSource:0}: Error finding container 469cc48d80d25b62be8c729d1669e662c6462c18c298116c9a12c245b824f3a9: Status 404 returned error can't find the container with id 469cc48d80d25b62be8c729d1669e662c6462c18c298116c9a12c245b824f3a9 Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.122205 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerStarted","Data":"469cc48d80d25b62be8c729d1669e662c6462c18c298116c9a12c245b824f3a9"} Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.697181 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.710144 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnj7\" (UniqueName: \"kubernetes.io/projected/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-kube-api-access-qfnj7\") pod \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.710262 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-logs\") pod \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.710401 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-config-data\") pod \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.711208 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-logs" (OuterVolumeSpecName: "logs") pod "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" (UID: "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.711711 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-combined-ca-bundle\") pod \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\" (UID: \"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5\") " Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.713888 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.725584 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-kube-api-access-qfnj7" (OuterVolumeSpecName: "kube-api-access-qfnj7") pod "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" (UID: "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5"). InnerVolumeSpecName "kube-api-access-qfnj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.779649 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" (UID: "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.781398 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-config-data" (OuterVolumeSpecName: "config-data") pod "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" (UID: "6964e606-b374-4cc9-a3fe-b31c8dc0c6d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.819075 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfnj7\" (UniqueName: \"kubernetes.io/projected/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-kube-api-access-qfnj7\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.819121 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:06 crc kubenswrapper[4935]: I1217 09:27:06.819136 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.140637 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9be18c4-27ba-4b1a-9364-33fd1d6a39e3" path="/var/lib/kubelet/pods/e9be18c4-27ba-4b1a-9364-33fd1d6a39e3/volumes" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.150463 4935 generic.go:334] "Generic (PLEG): container finished" podID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerID="f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110" exitCode=0 Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.150561 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5","Type":"ContainerDied","Data":"f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110"} Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.150583 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.150599 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6964e606-b374-4cc9-a3fe-b31c8dc0c6d5","Type":"ContainerDied","Data":"e0b0a84400db3be185b1c683b350ffadeb1530ea5dc49a4c2ca207b0de1e6583"} Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.150621 4935 scope.go:117] "RemoveContainer" containerID="f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.158399 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerStarted","Data":"66aa8232c6e694cf0fc325dbd7b5842044aaccb7720f9cf9a1d4c5fc5e172360"} Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.191489 4935 scope.go:117] "RemoveContainer" containerID="d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.192144 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.202555 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.231925 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:07 crc kubenswrapper[4935]: E1217 09:27:07.232538 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-log" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.232572 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-log" Dec 17 09:27:07 crc kubenswrapper[4935]: E1217 09:27:07.232586 4935 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-api" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.232593 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-api" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.232803 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-api" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.232826 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" containerName="nova-api-log" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.234292 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.234789 4935 scope.go:117] "RemoveContainer" containerID="f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110" Dec 17 09:27:07 crc kubenswrapper[4935]: E1217 09:27:07.236090 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110\": container with ID starting with f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110 not found: ID does not exist" containerID="f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.236132 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110"} err="failed to get container status \"f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110\": rpc error: code = NotFound desc = could not find container \"f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110\": container with ID starting with 
f84ad410a5d320e429a7bf6db1f6cb6dc76a545e549e1293c84872d4bc5d5110 not found: ID does not exist" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.236169 4935 scope.go:117] "RemoveContainer" containerID="d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc" Dec 17 09:27:07 crc kubenswrapper[4935]: E1217 09:27:07.241182 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc\": container with ID starting with d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc not found: ID does not exist" containerID="d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.241242 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc"} err="failed to get container status \"d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc\": rpc error: code = NotFound desc = could not find container \"d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc\": container with ID starting with d7774048841704bca7f27eae1e447a9683e3970f7e0da1803586fd413933f0bc not found: ID does not exist" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.241445 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.241457 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.241919 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.254878 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:07 crc 
kubenswrapper[4935]: I1217 09:27:07.332715 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkkn2\" (UniqueName: \"kubernetes.io/projected/0c55b389-0049-49a0-a305-e3118ea777da-kube-api-access-kkkn2\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.333103 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-config-data\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.333172 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.333415 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.333601 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.333709 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c55b389-0049-49a0-a305-e3118ea777da-logs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.435782 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-config-data\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.436325 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.436390 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-public-tls-certs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.436455 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.436514 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c55b389-0049-49a0-a305-e3118ea777da-logs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: 
I1217 09:27:07.436555 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkkn2\" (UniqueName: \"kubernetes.io/projected/0c55b389-0049-49a0-a305-e3118ea777da-kube-api-access-kkkn2\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.437450 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c55b389-0049-49a0-a305-e3118ea777da-logs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.441264 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-internal-tls-certs\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.442108 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.443151 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-config-data\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.445937 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.457255 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkkn2\" (UniqueName: \"kubernetes.io/projected/0c55b389-0049-49a0-a305-e3118ea777da-kube-api-access-kkkn2\") pod \"nova-api-0\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " pod="openstack/nova-api-0" Dec 17 09:27:07 crc kubenswrapper[4935]: I1217 09:27:07.557310 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:08 crc kubenswrapper[4935]: I1217 09:27:08.093137 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:08 crc kubenswrapper[4935]: W1217 09:27:08.101544 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c55b389_0049_49a0_a305_e3118ea777da.slice/crio-e06d8070f5e9b0ff0876d25d4e22de2b473dbcae40ec4678355218e342e0f087 WatchSource:0}: Error finding container e06d8070f5e9b0ff0876d25d4e22de2b473dbcae40ec4678355218e342e0f087: Status 404 returned error can't find the container with id e06d8070f5e9b0ff0876d25d4e22de2b473dbcae40ec4678355218e342e0f087 Dec 17 09:27:08 crc kubenswrapper[4935]: I1217 09:27:08.188094 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerStarted","Data":"2c9962e9c53140e99e884b357832def79baa361e42273bd0224100d32e8f45ec"} Dec 17 09:27:08 crc kubenswrapper[4935]: I1217 09:27:08.190729 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c55b389-0049-49a0-a305-e3118ea777da","Type":"ContainerStarted","Data":"e06d8070f5e9b0ff0876d25d4e22de2b473dbcae40ec4678355218e342e0f087"} Dec 17 09:27:08 crc kubenswrapper[4935]: I1217 09:27:08.388175 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:27:08 crc kubenswrapper[4935]: I1217 09:27:08.431760 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.136500 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6964e606-b374-4cc9-a3fe-b31c8dc0c6d5" path="/var/lib/kubelet/pods/6964e606-b374-4cc9-a3fe-b31c8dc0c6d5/volumes" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.207163 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c55b389-0049-49a0-a305-e3118ea777da","Type":"ContainerStarted","Data":"335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280"} Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.207664 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c55b389-0049-49a0-a305-e3118ea777da","Type":"ContainerStarted","Data":"5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6"} Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.213107 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerStarted","Data":"36bab1ab0080ed8bdef02c4f3b3b63f52844437928e73de8803b2ba6ae52d29b"} Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.239651 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.239624152 podStartE2EDuration="2.239624152s" podCreationTimestamp="2025-12-17 09:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:27:09.229023456 +0000 UTC m=+1348.888864219" watchObservedRunningTime="2025-12-17 09:27:09.239624152 +0000 UTC m=+1348.899464905" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.239841 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.530056 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-slw7z"] Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.532944 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.536413 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.536468 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.545932 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-slw7z"] Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.690213 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48dq4\" (UniqueName: \"kubernetes.io/projected/5230b826-60b1-4db8-a7a0-e63c356fcc72-kube-api-access-48dq4\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.690282 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-scripts\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.690451 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.690540 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-config-data\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.792900 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48dq4\" (UniqueName: \"kubernetes.io/projected/5230b826-60b1-4db8-a7a0-e63c356fcc72-kube-api-access-48dq4\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.792978 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-scripts\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.793060 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.793145 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-config-data\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.803262 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-scripts\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.803446 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-config-data\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.808974 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.822043 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48dq4\" (UniqueName: \"kubernetes.io/projected/5230b826-60b1-4db8-a7a0-e63c356fcc72-kube-api-access-48dq4\") pod \"nova-cell1-cell-mapping-slw7z\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:09 crc kubenswrapper[4935]: I1217 09:27:09.859838 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:10 crc kubenswrapper[4935]: I1217 09:27:10.372082 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-slw7z"] Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.024459 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.095102 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-f6dp6"] Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.096071 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" podUID="88cf002b-f718-4ada-9094-0484202077cb" containerName="dnsmasq-dns" containerID="cri-o://ca68bdfd42610ebd1d2678e1de14a6857034a15a6097a7cb31f9c7d0e02237cc" gracePeriod=10 Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.249813 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-slw7z" event={"ID":"5230b826-60b1-4db8-a7a0-e63c356fcc72","Type":"ContainerStarted","Data":"67f67f65ca0a4cd9cd9b72e4c1a34dc08ede7bee29601513cb8de09b8358403e"} Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.249869 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-slw7z" event={"ID":"5230b826-60b1-4db8-a7a0-e63c356fcc72","Type":"ContainerStarted","Data":"a0193c00b7850dadb4fe9f2a54371c51b2fe0b57dcb4ce70fcfb916228af610a"} Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.262132 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" event={"ID":"88cf002b-f718-4ada-9094-0484202077cb","Type":"ContainerDied","Data":"ca68bdfd42610ebd1d2678e1de14a6857034a15a6097a7cb31f9c7d0e02237cc"} Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.262100 4935 generic.go:334] "Generic (PLEG): container 
finished" podID="88cf002b-f718-4ada-9094-0484202077cb" containerID="ca68bdfd42610ebd1d2678e1de14a6857034a15a6097a7cb31f9c7d0e02237cc" exitCode=0 Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.272707 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-slw7z" podStartSLOduration=2.2726875189999998 podStartE2EDuration="2.272687519s" podCreationTimestamp="2025-12-17 09:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:27:11.270368063 +0000 UTC m=+1350.930208826" watchObservedRunningTime="2025-12-17 09:27:11.272687519 +0000 UTC m=+1350.932528282" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.278169 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerStarted","Data":"2e581075e1d1ff9f2f248f8ca217cfc19fc71618b2bb5c97e986069300ff193a"} Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.278434 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-central-agent" containerID="cri-o://66aa8232c6e694cf0fc325dbd7b5842044aaccb7720f9cf9a1d4c5fc5e172360" gracePeriod=30 Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.278872 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.279386 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="proxy-httpd" containerID="cri-o://2e581075e1d1ff9f2f248f8ca217cfc19fc71618b2bb5c97e986069300ff193a" gracePeriod=30 Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.279474 4935 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openstack/ceilometer-0" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="sg-core" containerID="cri-o://36bab1ab0080ed8bdef02c4f3b3b63f52844437928e73de8803b2ba6ae52d29b" gracePeriod=30 Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.279552 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-notification-agent" containerID="cri-o://2c9962e9c53140e99e884b357832def79baa361e42273bd0224100d32e8f45ec" gracePeriod=30 Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.586453 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.618149 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.761335243 podStartE2EDuration="6.618123243s" podCreationTimestamp="2025-12-17 09:27:05 +0000 UTC" firstStartedPulling="2025-12-17 09:27:06.032134555 +0000 UTC m=+1345.691975318" lastFinishedPulling="2025-12-17 09:27:10.888922555 +0000 UTC m=+1350.548763318" observedRunningTime="2025-12-17 09:27:11.320800406 +0000 UTC m=+1350.980641179" watchObservedRunningTime="2025-12-17 09:27:11.618123243 +0000 UTC m=+1351.277964006" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.751459 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-sb\") pod \"88cf002b-f718-4ada-9094-0484202077cb\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.751547 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-swift-storage-0\") pod \"88cf002b-f718-4ada-9094-0484202077cb\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.751603 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2vhs\" (UniqueName: \"kubernetes.io/projected/88cf002b-f718-4ada-9094-0484202077cb-kube-api-access-x2vhs\") pod \"88cf002b-f718-4ada-9094-0484202077cb\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.751812 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-svc\") pod \"88cf002b-f718-4ada-9094-0484202077cb\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.751849 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-nb\") pod \"88cf002b-f718-4ada-9094-0484202077cb\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.751881 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-config\") pod \"88cf002b-f718-4ada-9094-0484202077cb\" (UID: \"88cf002b-f718-4ada-9094-0484202077cb\") " Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.759721 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88cf002b-f718-4ada-9094-0484202077cb-kube-api-access-x2vhs" (OuterVolumeSpecName: "kube-api-access-x2vhs") pod "88cf002b-f718-4ada-9094-0484202077cb" (UID: "88cf002b-f718-4ada-9094-0484202077cb"). InnerVolumeSpecName "kube-api-access-x2vhs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.808769 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-config" (OuterVolumeSpecName: "config") pod "88cf002b-f718-4ada-9094-0484202077cb" (UID: "88cf002b-f718-4ada-9094-0484202077cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.810907 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "88cf002b-f718-4ada-9094-0484202077cb" (UID: "88cf002b-f718-4ada-9094-0484202077cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.814598 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "88cf002b-f718-4ada-9094-0484202077cb" (UID: "88cf002b-f718-4ada-9094-0484202077cb"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.814741 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "88cf002b-f718-4ada-9094-0484202077cb" (UID: "88cf002b-f718-4ada-9094-0484202077cb"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.825450 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "88cf002b-f718-4ada-9094-0484202077cb" (UID: "88cf002b-f718-4ada-9094-0484202077cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.854290 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.854329 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.854342 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.854353 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.854363 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/88cf002b-f718-4ada-9094-0484202077cb-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:11 crc kubenswrapper[4935]: I1217 09:27:11.854374 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2vhs\" (UniqueName: 
\"kubernetes.io/projected/88cf002b-f718-4ada-9094-0484202077cb-kube-api-access-x2vhs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.295817 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" event={"ID":"88cf002b-f718-4ada-9094-0484202077cb","Type":"ContainerDied","Data":"33265885f0c888f0d2dbf2527d31706b1a58a65ac8c6fb2ed85ad83dd840e84f"} Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.295910 4935 scope.go:117] "RemoveContainer" containerID="ca68bdfd42610ebd1d2678e1de14a6857034a15a6097a7cb31f9c7d0e02237cc" Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.296143 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-f6dp6" Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.307652 4935 generic.go:334] "Generic (PLEG): container finished" podID="5d8b192d-6898-4e6a-b442-63985cf34098" containerID="36bab1ab0080ed8bdef02c4f3b3b63f52844437928e73de8803b2ba6ae52d29b" exitCode=2 Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.307993 4935 generic.go:334] "Generic (PLEG): container finished" podID="5d8b192d-6898-4e6a-b442-63985cf34098" containerID="2c9962e9c53140e99e884b357832def79baa361e42273bd0224100d32e8f45ec" exitCode=0 Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.309209 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerDied","Data":"36bab1ab0080ed8bdef02c4f3b3b63f52844437928e73de8803b2ba6ae52d29b"} Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.309296 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerDied","Data":"2c9962e9c53140e99e884b357832def79baa361e42273bd0224100d32e8f45ec"} Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.329931 4935 scope.go:117] 
"RemoveContainer" containerID="e47d085119a866b8185db346dc6ad1dfc007295885ef5bdc9435a114646ba4ce" Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.343608 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-f6dp6"] Dec 17 09:27:12 crc kubenswrapper[4935]: I1217 09:27:12.353981 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-f6dp6"] Dec 17 09:27:13 crc kubenswrapper[4935]: I1217 09:27:13.138778 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88cf002b-f718-4ada-9094-0484202077cb" path="/var/lib/kubelet/pods/88cf002b-f718-4ada-9094-0484202077cb/volumes" Dec 17 09:27:14 crc kubenswrapper[4935]: I1217 09:27:14.333268 4935 generic.go:334] "Generic (PLEG): container finished" podID="5d8b192d-6898-4e6a-b442-63985cf34098" containerID="66aa8232c6e694cf0fc325dbd7b5842044aaccb7720f9cf9a1d4c5fc5e172360" exitCode=0 Dec 17 09:27:14 crc kubenswrapper[4935]: I1217 09:27:14.333518 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerDied","Data":"66aa8232c6e694cf0fc325dbd7b5842044aaccb7720f9cf9a1d4c5fc5e172360"} Dec 17 09:27:17 crc kubenswrapper[4935]: I1217 09:27:17.370529 4935 generic.go:334] "Generic (PLEG): container finished" podID="5230b826-60b1-4db8-a7a0-e63c356fcc72" containerID="67f67f65ca0a4cd9cd9b72e4c1a34dc08ede7bee29601513cb8de09b8358403e" exitCode=0 Dec 17 09:27:17 crc kubenswrapper[4935]: I1217 09:27:17.370619 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-slw7z" event={"ID":"5230b826-60b1-4db8-a7a0-e63c356fcc72","Type":"ContainerDied","Data":"67f67f65ca0a4cd9cd9b72e4c1a34dc08ede7bee29601513cb8de09b8358403e"} Dec 17 09:27:17 crc kubenswrapper[4935]: I1217 09:27:17.557956 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:27:17 crc 
kubenswrapper[4935]: I1217 09:27:17.558022 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.574695 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.576168 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.784917 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.938619 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-scripts\") pod \"5230b826-60b1-4db8-a7a0-e63c356fcc72\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.938719 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-combined-ca-bundle\") pod \"5230b826-60b1-4db8-a7a0-e63c356fcc72\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.938836 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-config-data\") pod \"5230b826-60b1-4db8-a7a0-e63c356fcc72\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.938952 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48dq4\" (UniqueName: \"kubernetes.io/projected/5230b826-60b1-4db8-a7a0-e63c356fcc72-kube-api-access-48dq4\") pod \"5230b826-60b1-4db8-a7a0-e63c356fcc72\" (UID: \"5230b826-60b1-4db8-a7a0-e63c356fcc72\") " Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.948126 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-scripts" (OuterVolumeSpecName: "scripts") pod "5230b826-60b1-4db8-a7a0-e63c356fcc72" (UID: "5230b826-60b1-4db8-a7a0-e63c356fcc72"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.949744 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5230b826-60b1-4db8-a7a0-e63c356fcc72-kube-api-access-48dq4" (OuterVolumeSpecName: "kube-api-access-48dq4") pod "5230b826-60b1-4db8-a7a0-e63c356fcc72" (UID: "5230b826-60b1-4db8-a7a0-e63c356fcc72"). InnerVolumeSpecName "kube-api-access-48dq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.991316 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-config-data" (OuterVolumeSpecName: "config-data") pod "5230b826-60b1-4db8-a7a0-e63c356fcc72" (UID: "5230b826-60b1-4db8-a7a0-e63c356fcc72"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:18 crc kubenswrapper[4935]: I1217 09:27:18.993339 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5230b826-60b1-4db8-a7a0-e63c356fcc72" (UID: "5230b826-60b1-4db8-a7a0-e63c356fcc72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.041877 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.042068 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.042158 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5230b826-60b1-4db8-a7a0-e63c356fcc72-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.042219 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48dq4\" (UniqueName: \"kubernetes.io/projected/5230b826-60b1-4db8-a7a0-e63c356fcc72-kube-api-access-48dq4\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.396343 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-slw7z" event={"ID":"5230b826-60b1-4db8-a7a0-e63c356fcc72","Type":"ContainerDied","Data":"a0193c00b7850dadb4fe9f2a54371c51b2fe0b57dcb4ce70fcfb916228af610a"} Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.396842 4935 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="a0193c00b7850dadb4fe9f2a54371c51b2fe0b57dcb4ce70fcfb916228af610a" Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.396434 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-slw7z" Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.606836 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.607201 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-log" containerID="cri-o://5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6" gracePeriod=30 Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.607266 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-api" containerID="cri-o://335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280" gracePeriod=30 Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.624128 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.624464 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="44959a00-28e4-4914-8596-cf160689ff68" containerName="nova-scheduler-scheduler" containerID="cri-o://f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233" gracePeriod=30 Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.710243 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.710584 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="59489e01-2f78-426d-b617-be609f6538f2" 
containerName="nova-metadata-log" containerID="cri-o://206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d" gracePeriod=30 Dec 17 09:27:19 crc kubenswrapper[4935]: I1217 09:27:19.710733 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-metadata" containerID="cri-o://44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980" gracePeriod=30 Dec 17 09:27:20 crc kubenswrapper[4935]: I1217 09:27:20.409652 4935 generic.go:334] "Generic (PLEG): container finished" podID="59489e01-2f78-426d-b617-be609f6538f2" containerID="206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d" exitCode=143 Dec 17 09:27:20 crc kubenswrapper[4935]: I1217 09:27:20.409718 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59489e01-2f78-426d-b617-be609f6538f2","Type":"ContainerDied","Data":"206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d"} Dec 17 09:27:20 crc kubenswrapper[4935]: I1217 09:27:20.411572 4935 generic.go:334] "Generic (PLEG): container finished" podID="0c55b389-0049-49a0-a305-e3118ea777da" containerID="5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6" exitCode=143 Dec 17 09:27:20 crc kubenswrapper[4935]: I1217 09:27:20.411600 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c55b389-0049-49a0-a305-e3118ea777da","Type":"ContainerDied","Data":"5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6"} Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.074327 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.200081 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t746p\" (UniqueName: \"kubernetes.io/projected/44959a00-28e4-4914-8596-cf160689ff68-kube-api-access-t746p\") pod \"44959a00-28e4-4914-8596-cf160689ff68\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.200431 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-config-data\") pod \"44959a00-28e4-4914-8596-cf160689ff68\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.200522 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-combined-ca-bundle\") pod \"44959a00-28e4-4914-8596-cf160689ff68\" (UID: \"44959a00-28e4-4914-8596-cf160689ff68\") " Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.207533 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44959a00-28e4-4914-8596-cf160689ff68-kube-api-access-t746p" (OuterVolumeSpecName: "kube-api-access-t746p") pod "44959a00-28e4-4914-8596-cf160689ff68" (UID: "44959a00-28e4-4914-8596-cf160689ff68"). InnerVolumeSpecName "kube-api-access-t746p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.232601 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44959a00-28e4-4914-8596-cf160689ff68" (UID: "44959a00-28e4-4914-8596-cf160689ff68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.236645 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-config-data" (OuterVolumeSpecName: "config-data") pod "44959a00-28e4-4914-8596-cf160689ff68" (UID: "44959a00-28e4-4914-8596-cf160689ff68"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.302669 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.303082 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t746p\" (UniqueName: \"kubernetes.io/projected/44959a00-28e4-4914-8596-cf160689ff68-kube-api-access-t746p\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.303094 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44959a00-28e4-4914-8596-cf160689ff68-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.425920 4935 generic.go:334] "Generic (PLEG): container finished" podID="44959a00-28e4-4914-8596-cf160689ff68" containerID="f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233" exitCode=0 Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.425974 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44959a00-28e4-4914-8596-cf160689ff68","Type":"ContainerDied","Data":"f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233"} Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.425992 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.426007 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"44959a00-28e4-4914-8596-cf160689ff68","Type":"ContainerDied","Data":"607429fc6d1cb88556972b0a83e0a41615e76445c479ac4fffb1408aeaca95e4"} Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.426026 4935 scope.go:117] "RemoveContainer" containerID="f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.460937 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.461440 4935 scope.go:117] "RemoveContainer" containerID="f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233" Dec 17 09:27:21 crc kubenswrapper[4935]: E1217 09:27:21.462066 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233\": container with ID starting with f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233 not found: ID does not exist" containerID="f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.462114 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233"} err="failed to get container status \"f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233\": rpc error: code = NotFound desc = could not find container \"f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233\": container with ID starting with f2bfbf6632a334d4ea84963f0c29126525fc9151147751f15f698936adcc0233 not found: ID does not exist" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.472624 4935 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.490836 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:27:21 crc kubenswrapper[4935]: E1217 09:27:21.491355 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cf002b-f718-4ada-9094-0484202077cb" containerName="dnsmasq-dns" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.491382 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cf002b-f718-4ada-9094-0484202077cb" containerName="dnsmasq-dns" Dec 17 09:27:21 crc kubenswrapper[4935]: E1217 09:27:21.491401 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5230b826-60b1-4db8-a7a0-e63c356fcc72" containerName="nova-manage" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.491411 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5230b826-60b1-4db8-a7a0-e63c356fcc72" containerName="nova-manage" Dec 17 09:27:21 crc kubenswrapper[4935]: E1217 09:27:21.491425 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44959a00-28e4-4914-8596-cf160689ff68" containerName="nova-scheduler-scheduler" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.491434 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="44959a00-28e4-4914-8596-cf160689ff68" containerName="nova-scheduler-scheduler" Dec 17 09:27:21 crc kubenswrapper[4935]: E1217 09:27:21.491458 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cf002b-f718-4ada-9094-0484202077cb" containerName="init" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.491467 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cf002b-f718-4ada-9094-0484202077cb" containerName="init" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.491735 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="44959a00-28e4-4914-8596-cf160689ff68" 
containerName="nova-scheduler-scheduler" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.491764 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5230b826-60b1-4db8-a7a0-e63c356fcc72" containerName="nova-manage" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.491799 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="88cf002b-f718-4ada-9094-0484202077cb" containerName="dnsmasq-dns" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.492958 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.496656 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.504810 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.610073 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0706ae-c551-44cd-8fa7-ac4e2da28664-config-data\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.610136 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8jr\" (UniqueName: \"kubernetes.io/projected/9a0706ae-c551-44cd-8fa7-ac4e2da28664-kube-api-access-9g8jr\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.610662 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0706ae-c551-44cd-8fa7-ac4e2da28664-combined-ca-bundle\") pod 
\"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.713225 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0706ae-c551-44cd-8fa7-ac4e2da28664-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.713384 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0706ae-c551-44cd-8fa7-ac4e2da28664-config-data\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.713412 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8jr\" (UniqueName: \"kubernetes.io/projected/9a0706ae-c551-44cd-8fa7-ac4e2da28664-kube-api-access-9g8jr\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.719149 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a0706ae-c551-44cd-8fa7-ac4e2da28664-config-data\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.721934 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a0706ae-c551-44cd-8fa7-ac4e2da28664-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.730251 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8jr\" (UniqueName: \"kubernetes.io/projected/9a0706ae-c551-44cd-8fa7-ac4e2da28664-kube-api-access-9g8jr\") pod \"nova-scheduler-0\" (UID: \"9a0706ae-c551-44cd-8fa7-ac4e2da28664\") " pod="openstack/nova-scheduler-0" Dec 17 09:27:21 crc kubenswrapper[4935]: I1217 09:27:21.819771 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 17 09:27:22 crc kubenswrapper[4935]: I1217 09:27:22.314082 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 17 09:27:22 crc kubenswrapper[4935]: I1217 09:27:22.440168 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9a0706ae-c551-44cd-8fa7-ac4e2da28664","Type":"ContainerStarted","Data":"7e56586d2e160288d9ee1e1e7bcf02b32ca081a4cb91e6de929cb40410f3a9cc"} Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.010028 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": dial tcp 10.217.0.191:8775: connect: connection refused" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.010170 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": dial tcp 10.217.0.191:8775: connect: connection refused" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.136614 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44959a00-28e4-4914-8596-cf160689ff68" path="/var/lib/kubelet/pods/44959a00-28e4-4914-8596-cf160689ff68/volumes" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.354526 4935 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.460392 4935 generic.go:334] "Generic (PLEG): container finished" podID="59489e01-2f78-426d-b617-be609f6538f2" containerID="44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980" exitCode=0 Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.460504 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59489e01-2f78-426d-b617-be609f6538f2","Type":"ContainerDied","Data":"44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980"} Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.460545 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"59489e01-2f78-426d-b617-be609f6538f2","Type":"ContainerDied","Data":"da5923fa99a2ef871a3bec99b1acc03a8e3bb6e2309751f4d04203f52a69c8e0"} Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.460570 4935 scope.go:117] "RemoveContainer" containerID="44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.460655 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.464612 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9a0706ae-c551-44cd-8fa7-ac4e2da28664","Type":"ContainerStarted","Data":"cc9ff15831e46d9fca0ad4258067cf91071befd27f7c71ed83606583b835c659"} Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.501201 4935 scope.go:117] "RemoveContainer" containerID="206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.504430 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.504403564 podStartE2EDuration="2.504403564s" podCreationTimestamp="2025-12-17 09:27:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:27:23.482563035 +0000 UTC m=+1363.142403798" watchObservedRunningTime="2025-12-17 09:27:23.504403564 +0000 UTC m=+1363.164244357" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.522756 4935 scope.go:117] "RemoveContainer" containerID="44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980" Dec 17 09:27:23 crc kubenswrapper[4935]: E1217 09:27:23.523633 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980\": container with ID starting with 44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980 not found: ID does not exist" containerID="44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.523694 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980"} err="failed to get 
container status \"44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980\": rpc error: code = NotFound desc = could not find container \"44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980\": container with ID starting with 44455f46a402eaf9ae9a7b9b180271f679d187a2308baa246618d8504ec20980 not found: ID does not exist" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.523730 4935 scope.go:117] "RemoveContainer" containerID="206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d" Dec 17 09:27:23 crc kubenswrapper[4935]: E1217 09:27:23.524353 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d\": container with ID starting with 206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d not found: ID does not exist" containerID="206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.524413 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d"} err="failed to get container status \"206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d\": rpc error: code = NotFound desc = could not find container \"206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d\": container with ID starting with 206122467bb7849c546f7f7252e3791113a1efb51df90876f5c71ce354d2274d not found: ID does not exist" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.569222 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-nova-metadata-tls-certs\") pod \"59489e01-2f78-426d-b617-be609f6538f2\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 
09:27:23.569396 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-combined-ca-bundle\") pod \"59489e01-2f78-426d-b617-be609f6538f2\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.569655 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59489e01-2f78-426d-b617-be609f6538f2-logs\") pod \"59489e01-2f78-426d-b617-be609f6538f2\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.569685 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxxlp\" (UniqueName: \"kubernetes.io/projected/59489e01-2f78-426d-b617-be609f6538f2-kube-api-access-wxxlp\") pod \"59489e01-2f78-426d-b617-be609f6538f2\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.569821 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-config-data\") pod \"59489e01-2f78-426d-b617-be609f6538f2\" (UID: \"59489e01-2f78-426d-b617-be609f6538f2\") " Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.571571 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59489e01-2f78-426d-b617-be609f6538f2-logs" (OuterVolumeSpecName: "logs") pod "59489e01-2f78-426d-b617-be609f6538f2" (UID: "59489e01-2f78-426d-b617-be609f6538f2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.582710 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59489e01-2f78-426d-b617-be609f6538f2-kube-api-access-wxxlp" (OuterVolumeSpecName: "kube-api-access-wxxlp") pod "59489e01-2f78-426d-b617-be609f6538f2" (UID: "59489e01-2f78-426d-b617-be609f6538f2"). InnerVolumeSpecName "kube-api-access-wxxlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.602410 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-config-data" (OuterVolumeSpecName: "config-data") pod "59489e01-2f78-426d-b617-be609f6538f2" (UID: "59489e01-2f78-426d-b617-be609f6538f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.606917 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59489e01-2f78-426d-b617-be609f6538f2" (UID: "59489e01-2f78-426d-b617-be609f6538f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.634660 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "59489e01-2f78-426d-b617-be609f6538f2" (UID: "59489e01-2f78-426d-b617-be609f6538f2"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.673332 4935 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.673366 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.673378 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59489e01-2f78-426d-b617-be609f6538f2-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.673389 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxxlp\" (UniqueName: \"kubernetes.io/projected/59489e01-2f78-426d-b617-be609f6538f2-kube-api-access-wxxlp\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.673397 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59489e01-2f78-426d-b617-be609f6538f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.797357 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.818330 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.835052 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:27:23 crc kubenswrapper[4935]: E1217 09:27:23.835680 4935 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-metadata" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.835702 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-metadata" Dec 17 09:27:23 crc kubenswrapper[4935]: E1217 09:27:23.835728 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-log" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.835751 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-log" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.835971 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-metadata" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.836012 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="59489e01-2f78-426d-b617-be609f6538f2" containerName="nova-metadata-log" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.837325 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.840108 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.840458 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.854037 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.990625 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-config-data\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.990894 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.990946 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8366a42-0c62-4527-b173-f7bfdbd2223a-logs\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.990971 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwnd\" (UniqueName: \"kubernetes.io/projected/b8366a42-0c62-4527-b173-f7bfdbd2223a-kube-api-access-6gwnd\") pod \"nova-metadata-0\" 
(UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:23 crc kubenswrapper[4935]: I1217 09:27:23.991308 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.094245 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-config-data\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.094423 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.094502 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8366a42-0c62-4527-b173-f7bfdbd2223a-logs\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.095548 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8366a42-0c62-4527-b173-f7bfdbd2223a-logs\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.094551 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6gwnd\" (UniqueName: \"kubernetes.io/projected/b8366a42-0c62-4527-b173-f7bfdbd2223a-kube-api-access-6gwnd\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.095807 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.101876 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.103389 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-config-data\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.104178 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8366a42-0c62-4527-b173-f7bfdbd2223a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.115975 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwnd\" (UniqueName: \"kubernetes.io/projected/b8366a42-0c62-4527-b173-f7bfdbd2223a-kube-api-access-6gwnd\") pod 
\"nova-metadata-0\" (UID: \"b8366a42-0c62-4527-b173-f7bfdbd2223a\") " pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.174093 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.472860 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.478879 4935 generic.go:334] "Generic (PLEG): container finished" podID="0c55b389-0049-49a0-a305-e3118ea777da" containerID="335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280" exitCode=0 Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.479534 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.479537 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c55b389-0049-49a0-a305-e3118ea777da","Type":"ContainerDied","Data":"335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280"} Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.479636 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0c55b389-0049-49a0-a305-e3118ea777da","Type":"ContainerDied","Data":"e06d8070f5e9b0ff0876d25d4e22de2b473dbcae40ec4678355218e342e0f087"} Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.479711 4935 scope.go:117] "RemoveContainer" containerID="335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.511953 4935 scope.go:117] "RemoveContainer" containerID="5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.530972 4935 scope.go:117] "RemoveContainer" containerID="335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280" 
Dec 17 09:27:24 crc kubenswrapper[4935]: E1217 09:27:24.531480 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280\": container with ID starting with 335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280 not found: ID does not exist" containerID="335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.531526 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280"} err="failed to get container status \"335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280\": rpc error: code = NotFound desc = could not find container \"335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280\": container with ID starting with 335ea4e0f83de3bfeb508914dfc756afec74b402c8fc107741c2ad6f3025a280 not found: ID does not exist" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.531557 4935 scope.go:117] "RemoveContainer" containerID="5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6" Dec 17 09:27:24 crc kubenswrapper[4935]: E1217 09:27:24.531921 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6\": container with ID starting with 5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6 not found: ID does not exist" containerID="5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.531972 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6"} err="failed to get container status 
\"5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6\": rpc error: code = NotFound desc = could not find container \"5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6\": container with ID starting with 5b8e3d242c03efdc1660cb2b952d6d5b7f2b6a29e76a477d57b44a9880b447b6 not found: ID does not exist" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.613620 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c55b389-0049-49a0-a305-e3118ea777da-logs\") pod \"0c55b389-0049-49a0-a305-e3118ea777da\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.613715 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-internal-tls-certs\") pod \"0c55b389-0049-49a0-a305-e3118ea777da\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.613797 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-combined-ca-bundle\") pod \"0c55b389-0049-49a0-a305-e3118ea777da\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.613830 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-config-data\") pod \"0c55b389-0049-49a0-a305-e3118ea777da\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.614078 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkkn2\" (UniqueName: \"kubernetes.io/projected/0c55b389-0049-49a0-a305-e3118ea777da-kube-api-access-kkkn2\") pod 
\"0c55b389-0049-49a0-a305-e3118ea777da\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.614304 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-public-tls-certs\") pod \"0c55b389-0049-49a0-a305-e3118ea777da\" (UID: \"0c55b389-0049-49a0-a305-e3118ea777da\") " Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.615137 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c55b389-0049-49a0-a305-e3118ea777da-logs" (OuterVolumeSpecName: "logs") pod "0c55b389-0049-49a0-a305-e3118ea777da" (UID: "0c55b389-0049-49a0-a305-e3118ea777da"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.615842 4935 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c55b389-0049-49a0-a305-e3118ea777da-logs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.620164 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c55b389-0049-49a0-a305-e3118ea777da-kube-api-access-kkkn2" (OuterVolumeSpecName: "kube-api-access-kkkn2") pod "0c55b389-0049-49a0-a305-e3118ea777da" (UID: "0c55b389-0049-49a0-a305-e3118ea777da"). InnerVolumeSpecName "kube-api-access-kkkn2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.649726 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-config-data" (OuterVolumeSpecName: "config-data") pod "0c55b389-0049-49a0-a305-e3118ea777da" (UID: "0c55b389-0049-49a0-a305-e3118ea777da"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.654282 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c55b389-0049-49a0-a305-e3118ea777da" (UID: "0c55b389-0049-49a0-a305-e3118ea777da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.675870 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0c55b389-0049-49a0-a305-e3118ea777da" (UID: "0c55b389-0049-49a0-a305-e3118ea777da"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.676853 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0c55b389-0049-49a0-a305-e3118ea777da" (UID: "0c55b389-0049-49a0-a305-e3118ea777da"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.718690 4935 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.718768 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.718784 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.718796 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkkn2\" (UniqueName: \"kubernetes.io/projected/0c55b389-0049-49a0-a305-e3118ea777da-kube-api-access-kkkn2\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.718811 4935 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c55b389-0049-49a0-a305-e3118ea777da-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.720471 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.847685 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.866127 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.879545 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 
17 09:27:24 crc kubenswrapper[4935]: E1217 09:27:24.880044 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-api" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.880064 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-api" Dec 17 09:27:24 crc kubenswrapper[4935]: E1217 09:27:24.880111 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-log" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.880119 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-log" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.880392 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-api" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.880412 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c55b389-0049-49a0-a305-e3118ea777da" containerName="nova-api-log" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.881541 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.885449 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.885876 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.886077 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 17 09:27:24 crc kubenswrapper[4935]: I1217 09:27:24.892688 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.030431 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb5ad70-4355-4513-8420-a2e99ea5a3be-logs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.030939 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.031019 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rrcn\" (UniqueName: \"kubernetes.io/projected/bbb5ad70-4355-4513-8420-a2e99ea5a3be-kube-api-access-2rrcn\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.031091 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-public-tls-certs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.031150 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-config-data\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.031179 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.133156 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-public-tls-certs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.133249 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-config-data\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.133319 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 
09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.135881 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb5ad70-4355-4513-8420-a2e99ea5a3be-logs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.135915 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.136191 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c55b389-0049-49a0-a305-e3118ea777da" path="/var/lib/kubelet/pods/0c55b389-0049-49a0-a305-e3118ea777da/volumes" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.136375 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bbb5ad70-4355-4513-8420-a2e99ea5a3be-logs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.136537 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rrcn\" (UniqueName: \"kubernetes.io/projected/bbb5ad70-4355-4513-8420-a2e99ea5a3be-kube-api-access-2rrcn\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.137501 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59489e01-2f78-426d-b617-be609f6538f2" path="/var/lib/kubelet/pods/59489e01-2f78-426d-b617-be609f6538f2/volumes" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.138430 4935 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.138956 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-config-data\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.139413 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-public-tls-certs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.141990 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb5ad70-4355-4513-8420-a2e99ea5a3be-internal-tls-certs\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.154577 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rrcn\" (UniqueName: \"kubernetes.io/projected/bbb5ad70-4355-4513-8420-a2e99ea5a3be-kube-api-access-2rrcn\") pod \"nova-api-0\" (UID: \"bbb5ad70-4355-4513-8420-a2e99ea5a3be\") " pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.206989 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.492732 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b8366a42-0c62-4527-b173-f7bfdbd2223a","Type":"ContainerStarted","Data":"6079ad625a8483c7016cb237282d552e6031f4adfb69de195f231fd80724c4b6"} Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.493140 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b8366a42-0c62-4527-b173-f7bfdbd2223a","Type":"ContainerStarted","Data":"8c8156dbf292435b355a20dfe62ece2b3d7d8a36a0ff90ea24dc5345ef13df95"} Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.493152 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b8366a42-0c62-4527-b173-f7bfdbd2223a","Type":"ContainerStarted","Data":"57978e7447af373817a3727df00c83868d059545863ecd2937f4307d5ff15e21"} Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.524865 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.524826013 podStartE2EDuration="2.524826013s" podCreationTimestamp="2025-12-17 09:27:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:27:25.514226067 +0000 UTC m=+1365.174066850" watchObservedRunningTime="2025-12-17 09:27:25.524826013 +0000 UTC m=+1365.184666776" Dec 17 09:27:25 crc kubenswrapper[4935]: I1217 09:27:25.678320 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 17 09:27:25 crc kubenswrapper[4935]: W1217 09:27:25.678703 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb5ad70_4355_4513_8420_a2e99ea5a3be.slice/crio-23616d260b861e92de099a25d211831169ebb943c177f4269f0da1e8b54601d5 WatchSource:0}: Error 
finding container 23616d260b861e92de099a25d211831169ebb943c177f4269f0da1e8b54601d5: Status 404 returned error can't find the container with id 23616d260b861e92de099a25d211831169ebb943c177f4269f0da1e8b54601d5 Dec 17 09:27:26 crc kubenswrapper[4935]: I1217 09:27:26.503262 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbb5ad70-4355-4513-8420-a2e99ea5a3be","Type":"ContainerStarted","Data":"7f86380950ed8c4235e9b6d3e580a0918ce0670d6e8a59fac088a9e84461f761"} Dec 17 09:27:26 crc kubenswrapper[4935]: I1217 09:27:26.503707 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbb5ad70-4355-4513-8420-a2e99ea5a3be","Type":"ContainerStarted","Data":"8577fd97face0ac17d8feb024fd0b7bb428c2dba1de299685d22eac32490bc11"} Dec 17 09:27:26 crc kubenswrapper[4935]: I1217 09:27:26.503720 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bbb5ad70-4355-4513-8420-a2e99ea5a3be","Type":"ContainerStarted","Data":"23616d260b861e92de099a25d211831169ebb943c177f4269f0da1e8b54601d5"} Dec 17 09:27:26 crc kubenswrapper[4935]: I1217 09:27:26.529120 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.5290942899999997 podStartE2EDuration="2.52909429s" podCreationTimestamp="2025-12-17 09:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:27:26.527426499 +0000 UTC m=+1366.187267262" watchObservedRunningTime="2025-12-17 09:27:26.52909429 +0000 UTC m=+1366.188935053" Dec 17 09:27:26 crc kubenswrapper[4935]: I1217 09:27:26.820909 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 17 09:27:29 crc kubenswrapper[4935]: I1217 09:27:29.175177 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 17 
09:27:29 crc kubenswrapper[4935]: I1217 09:27:29.175743 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 17 09:27:30 crc kubenswrapper[4935]: I1217 09:27:30.130160 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:27:30 crc kubenswrapper[4935]: I1217 09:27:30.130649 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:27:31 crc kubenswrapper[4935]: I1217 09:27:31.820237 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 17 09:27:31 crc kubenswrapper[4935]: I1217 09:27:31.858929 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 17 09:27:32 crc kubenswrapper[4935]: I1217 09:27:32.598348 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 17 09:27:34 crc kubenswrapper[4935]: I1217 09:27:34.175417 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 17 09:27:34 crc kubenswrapper[4935]: I1217 09:27:34.176039 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 17 09:27:35 crc kubenswrapper[4935]: I1217 09:27:35.195600 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b8366a42-0c62-4527-b173-f7bfdbd2223a" containerName="nova-metadata-log" 
probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:27:35 crc kubenswrapper[4935]: I1217 09:27:35.195667 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b8366a42-0c62-4527-b173-f7bfdbd2223a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:27:35 crc kubenswrapper[4935]: I1217 09:27:35.207592 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:27:35 crc kubenswrapper[4935]: I1217 09:27:35.207702 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 17 09:27:35 crc kubenswrapper[4935]: I1217 09:27:35.521208 4935 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Dec 17 09:27:36 crc kubenswrapper[4935]: I1217 09:27:36.228707 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bbb5ad70-4355-4513-8420-a2e99ea5a3be" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:27:36 crc kubenswrapper[4935]: I1217 09:27:36.228708 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bbb5ad70-4355-4513-8420-a2e99ea5a3be" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.685041 4935 generic.go:334] "Generic (PLEG): container finished" 
podID="5d8b192d-6898-4e6a-b442-63985cf34098" containerID="2e581075e1d1ff9f2f248f8ca217cfc19fc71618b2bb5c97e986069300ff193a" exitCode=137 Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.685143 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerDied","Data":"2e581075e1d1ff9f2f248f8ca217cfc19fc71618b2bb5c97e986069300ff193a"} Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.685942 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5d8b192d-6898-4e6a-b442-63985cf34098","Type":"ContainerDied","Data":"469cc48d80d25b62be8c729d1669e662c6462c18c298116c9a12c245b824f3a9"} Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.685963 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="469cc48d80d25b62be8c729d1669e662c6462c18c298116c9a12c245b824f3a9" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.698179 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728205 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-sg-core-conf-yaml\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728353 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-scripts\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728455 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-log-httpd\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728512 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-run-httpd\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728558 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-ceilometer-tls-certs\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728682 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rxd\" (UniqueName: 
\"kubernetes.io/projected/5d8b192d-6898-4e6a-b442-63985cf34098-kube-api-access-w8rxd\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728730 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-config-data\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.728780 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-combined-ca-bundle\") pod \"5d8b192d-6898-4e6a-b442-63985cf34098\" (UID: \"5d8b192d-6898-4e6a-b442-63985cf34098\") " Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.730071 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: "5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.730656 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: "5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.742937 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-scripts" (OuterVolumeSpecName: "scripts") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: "5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.746429 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8b192d-6898-4e6a-b442-63985cf34098-kube-api-access-w8rxd" (OuterVolumeSpecName: "kube-api-access-w8rxd") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: "5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "kube-api-access-w8rxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.772010 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: "5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.790952 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: "5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.832213 4935 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.832258 4935 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-scripts\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.832275 4935 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.832366 4935 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d8b192d-6898-4e6a-b442-63985cf34098-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.832378 4935 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.832390 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8rxd\" (UniqueName: \"kubernetes.io/projected/5d8b192d-6898-4e6a-b442-63985cf34098-kube-api-access-w8rxd\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.834029 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: 
"5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.852631 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-config-data" (OuterVolumeSpecName: "config-data") pod "5d8b192d-6898-4e6a-b442-63985cf34098" (UID: "5d8b192d-6898-4e6a-b442-63985cf34098"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.936050 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:41 crc kubenswrapper[4935]: I1217 09:27:41.936103 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d8b192d-6898-4e6a-b442-63985cf34098-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.695609 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.736330 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.747360 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.768392 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:42 crc kubenswrapper[4935]: E1217 09:27:42.768963 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-central-agent" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.768993 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-central-agent" Dec 17 09:27:42 crc kubenswrapper[4935]: E1217 09:27:42.769012 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-notification-agent" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.769022 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-notification-agent" Dec 17 09:27:42 crc kubenswrapper[4935]: E1217 09:27:42.769067 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="sg-core" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.769077 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="sg-core" Dec 17 09:27:42 crc kubenswrapper[4935]: E1217 09:27:42.769103 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="proxy-httpd" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.769112 4935 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="proxy-httpd" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.769519 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-notification-agent" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.769552 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="ceilometer-central-agent" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.769570 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="proxy-httpd" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.769589 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" containerName="sg-core" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.773327 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.871147 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.871521 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.871650 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.874985 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.970542 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-scripts\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.970622 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.970749 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-config-data\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.970806 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.970826 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9987t\" (UniqueName: \"kubernetes.io/projected/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-kube-api-access-9987t\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.970847 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.971016 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-run-httpd\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:42 crc kubenswrapper[4935]: I1217 09:27:42.971149 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-log-httpd\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.074628 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.074702 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9987t\" (UniqueName: \"kubernetes.io/projected/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-kube-api-access-9987t\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.074745 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.074787 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-run-httpd\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.074838 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-log-httpd\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.074930 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-scripts\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.074979 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.075118 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-config-data\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.076680 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-log-httpd\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.076989 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-run-httpd\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.081261 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.081396 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-scripts\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.084165 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-config-data\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.084821 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.086266 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.100142 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9987t\" (UniqueName: \"kubernetes.io/projected/7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053-kube-api-access-9987t\") pod \"ceilometer-0\" (UID: \"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053\") " pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.137613 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d8b192d-6898-4e6a-b442-63985cf34098" path="/var/lib/kubelet/pods/5d8b192d-6898-4e6a-b442-63985cf34098/volumes" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.190069 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.652728 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 17 09:27:43 crc kubenswrapper[4935]: W1217 09:27:43.657079 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d6f03f3_bf3c_478b_83fa_e1d1d2f0f053.slice/crio-be017c66dc885e30ff3770929ed3131f3a60833456f7c683cb1c13a7a2f3126d WatchSource:0}: Error finding container be017c66dc885e30ff3770929ed3131f3a60833456f7c683cb1c13a7a2f3126d: Status 404 returned error can't find the container with id be017c66dc885e30ff3770929ed3131f3a60833456f7c683cb1c13a7a2f3126d Dec 17 09:27:43 crc kubenswrapper[4935]: I1217 09:27:43.710468 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053","Type":"ContainerStarted","Data":"be017c66dc885e30ff3770929ed3131f3a60833456f7c683cb1c13a7a2f3126d"} Dec 17 09:27:44 crc kubenswrapper[4935]: I1217 09:27:44.187112 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 17 09:27:44 crc kubenswrapper[4935]: I1217 09:27:44.193087 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 17 09:27:44 crc kubenswrapper[4935]: I1217 09:27:44.195027 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 17 09:27:44 crc kubenswrapper[4935]: I1217 09:27:44.730895 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053","Type":"ContainerStarted","Data":"0c75f50f698ca20924ad189bc0f49bfa6f2d4309b6252714e641e49d8a71ff6c"} Dec 17 09:27:44 crc kubenswrapper[4935]: I1217 09:27:44.739331 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Dec 17 09:27:45 crc kubenswrapper[4935]: I1217 09:27:45.230429 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 17 09:27:45 crc kubenswrapper[4935]: I1217 09:27:45.231482 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 17 09:27:45 crc kubenswrapper[4935]: I1217 09:27:45.245405 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 17 09:27:45 crc kubenswrapper[4935]: I1217 09:27:45.262395 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 17 09:27:45 crc kubenswrapper[4935]: I1217 09:27:45.748350 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053","Type":"ContainerStarted","Data":"07a26b39c2217b7052305726f429e6b00dd863ddd5f32a9f86fa6cba4da2af52"} Dec 17 09:27:45 crc kubenswrapper[4935]: I1217 09:27:45.749734 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 17 09:27:45 crc kubenswrapper[4935]: I1217 09:27:45.760681 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 17 09:27:46 crc kubenswrapper[4935]: I1217 09:27:46.764313 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053","Type":"ContainerStarted","Data":"dd6f7716cfd2762391b8d239a2eaedcd493783d4ff652137855ed6ec3427b580"} Dec 17 09:27:48 crc kubenswrapper[4935]: I1217 09:27:48.789899 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053","Type":"ContainerStarted","Data":"15887dac6adb2f05111874b5a148445da11b08357be3eb2e40415bbe125436ba"} Dec 17 09:27:48 crc kubenswrapper[4935]: I1217 09:27:48.790743 4935 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 17 09:27:48 crc kubenswrapper[4935]: I1217 09:27:48.827698 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.80170849 podStartE2EDuration="6.827668909s" podCreationTimestamp="2025-12-17 09:27:42 +0000 UTC" firstStartedPulling="2025-12-17 09:27:43.66114541 +0000 UTC m=+1383.320986173" lastFinishedPulling="2025-12-17 09:27:47.687105789 +0000 UTC m=+1387.346946592" observedRunningTime="2025-12-17 09:27:48.811071096 +0000 UTC m=+1388.470911869" watchObservedRunningTime="2025-12-17 09:27:48.827668909 +0000 UTC m=+1388.487509682" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.352833 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k572c"] Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.356649 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.373840 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k572c"] Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.464061 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-utilities\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.464633 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn7s2\" (UniqueName: \"kubernetes.io/projected/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-kube-api-access-pn7s2\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " 
pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.464746 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-catalog-content\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.566979 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-catalog-content\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.567143 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-utilities\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.567243 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn7s2\" (UniqueName: \"kubernetes.io/projected/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-kube-api-access-pn7s2\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.567903 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-utilities\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " 
pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.568347 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-catalog-content\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.594215 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn7s2\" (UniqueName: \"kubernetes.io/projected/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-kube-api-access-pn7s2\") pod \"redhat-operators-k572c\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:54 crc kubenswrapper[4935]: I1217 09:27:54.687978 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:27:55 crc kubenswrapper[4935]: I1217 09:27:55.154028 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k572c"] Dec 17 09:27:55 crc kubenswrapper[4935]: I1217 09:27:55.899300 4935 generic.go:334] "Generic (PLEG): container finished" podID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerID="16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9" exitCode=0 Dec 17 09:27:55 crc kubenswrapper[4935]: I1217 09:27:55.899802 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k572c" event={"ID":"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f","Type":"ContainerDied","Data":"16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9"} Dec 17 09:27:55 crc kubenswrapper[4935]: I1217 09:27:55.899839 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k572c" 
event={"ID":"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f","Type":"ContainerStarted","Data":"14ca3ee04842622c8beec9f1ab8ad0a5fe32a0729fc5d26cc4ffe16f1e47ca5f"} Dec 17 09:27:57 crc kubenswrapper[4935]: I1217 09:27:57.923833 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k572c" event={"ID":"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f","Type":"ContainerStarted","Data":"33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3"} Dec 17 09:27:58 crc kubenswrapper[4935]: I1217 09:27:58.938930 4935 generic.go:334] "Generic (PLEG): container finished" podID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerID="33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3" exitCode=0 Dec 17 09:27:58 crc kubenswrapper[4935]: I1217 09:27:58.939037 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k572c" event={"ID":"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f","Type":"ContainerDied","Data":"33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3"} Dec 17 09:28:00 crc kubenswrapper[4935]: I1217 09:28:00.131001 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:28:00 crc kubenswrapper[4935]: I1217 09:28:00.131085 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:28:00 crc kubenswrapper[4935]: I1217 09:28:00.131140 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:28:00 crc kubenswrapper[4935]: I1217 09:28:00.131885 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"65dd41ad94dd7bae1b7cbbd3c318eb23617db601045887b6ac1ed745fc1e5001"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:28:00 crc kubenswrapper[4935]: I1217 09:28:00.131971 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://65dd41ad94dd7bae1b7cbbd3c318eb23617db601045887b6ac1ed745fc1e5001" gracePeriod=600 Dec 17 09:28:01 crc kubenswrapper[4935]: I1217 09:28:01.330393 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="65dd41ad94dd7bae1b7cbbd3c318eb23617db601045887b6ac1ed745fc1e5001" exitCode=0 Dec 17 09:28:01 crc kubenswrapper[4935]: I1217 09:28:01.330576 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"65dd41ad94dd7bae1b7cbbd3c318eb23617db601045887b6ac1ed745fc1e5001"} Dec 17 09:28:01 crc kubenswrapper[4935]: I1217 09:28:01.331461 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9"} Dec 17 09:28:01 crc kubenswrapper[4935]: I1217 09:28:01.331493 4935 scope.go:117] "RemoveContainer" 
containerID="b2296c1b96eef7533eacbd1e8dfbd023ac687a2c9a2c2ca41bf8bcc87a90001d" Dec 17 09:28:01 crc kubenswrapper[4935]: I1217 09:28:01.340680 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k572c" event={"ID":"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f","Type":"ContainerStarted","Data":"0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011"} Dec 17 09:28:01 crc kubenswrapper[4935]: I1217 09:28:01.382706 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k572c" podStartSLOduration=3.213593252 podStartE2EDuration="7.382686291s" podCreationTimestamp="2025-12-17 09:27:54 +0000 UTC" firstStartedPulling="2025-12-17 09:27:55.90216356 +0000 UTC m=+1395.562004323" lastFinishedPulling="2025-12-17 09:28:00.071256559 +0000 UTC m=+1399.731097362" observedRunningTime="2025-12-17 09:28:01.374713277 +0000 UTC m=+1401.034554040" watchObservedRunningTime="2025-12-17 09:28:01.382686291 +0000 UTC m=+1401.042527054" Dec 17 09:28:04 crc kubenswrapper[4935]: I1217 09:28:04.688720 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:28:04 crc kubenswrapper[4935]: I1217 09:28:04.690422 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:28:05 crc kubenswrapper[4935]: I1217 09:28:05.737201 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k572c" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="registry-server" probeResult="failure" output=< Dec 17 09:28:05 crc kubenswrapper[4935]: timeout: failed to connect service ":50051" within 1s Dec 17 09:28:05 crc kubenswrapper[4935]: > Dec 17 09:28:13 crc kubenswrapper[4935]: I1217 09:28:13.201679 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 17 
09:28:14 crc kubenswrapper[4935]: I1217 09:28:14.759592 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:28:14 crc kubenswrapper[4935]: I1217 09:28:14.817385 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:28:15 crc kubenswrapper[4935]: I1217 09:28:15.004635 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k572c"] Dec 17 09:28:16 crc kubenswrapper[4935]: I1217 09:28:16.505092 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k572c" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="registry-server" containerID="cri-o://0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011" gracePeriod=2 Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.155464 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.186394 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-utilities\") pod \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.187129 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-catalog-content\") pod \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.187245 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn7s2\" (UniqueName: \"kubernetes.io/projected/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-kube-api-access-pn7s2\") pod \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\" (UID: \"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f\") " Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.187674 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-utilities" (OuterVolumeSpecName: "utilities") pod "b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" (UID: "b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.190790 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.197347 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-kube-api-access-pn7s2" (OuterVolumeSpecName: "kube-api-access-pn7s2") pod "b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" (UID: "b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f"). InnerVolumeSpecName "kube-api-access-pn7s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.292794 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn7s2\" (UniqueName: \"kubernetes.io/projected/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-kube-api-access-pn7s2\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.317363 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" (UID: "b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.395078 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.523083 4935 generic.go:334] "Generic (PLEG): container finished" podID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerID="0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011" exitCode=0 Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.523138 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k572c" event={"ID":"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f","Type":"ContainerDied","Data":"0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011"} Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.523178 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k572c" event={"ID":"b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f","Type":"ContainerDied","Data":"14ca3ee04842622c8beec9f1ab8ad0a5fe32a0729fc5d26cc4ffe16f1e47ca5f"} Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.523185 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k572c" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.523198 4935 scope.go:117] "RemoveContainer" containerID="0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.551830 4935 scope.go:117] "RemoveContainer" containerID="33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.570755 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k572c"] Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.578100 4935 scope.go:117] "RemoveContainer" containerID="16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.583779 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k572c"] Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.621704 4935 scope.go:117] "RemoveContainer" containerID="0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011" Dec 17 09:28:17 crc kubenswrapper[4935]: E1217 09:28:17.622405 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011\": container with ID starting with 0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011 not found: ID does not exist" containerID="0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.622446 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011"} err="failed to get container status \"0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011\": rpc error: code = NotFound desc = could not find container 
\"0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011\": container with ID starting with 0618eefbb6aa412e4e5db0c4eb46e82ea34bfd49bb02929f18c5db6622aa6011 not found: ID does not exist" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.622480 4935 scope.go:117] "RemoveContainer" containerID="33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3" Dec 17 09:28:17 crc kubenswrapper[4935]: E1217 09:28:17.622925 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3\": container with ID starting with 33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3 not found: ID does not exist" containerID="33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.623003 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3"} err="failed to get container status \"33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3\": rpc error: code = NotFound desc = could not find container \"33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3\": container with ID starting with 33df80862fc8a18320c136a2dc8bb60fd0e6f837db9a6ceebef4c46dd360c0f3 not found: ID does not exist" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.623058 4935 scope.go:117] "RemoveContainer" containerID="16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9" Dec 17 09:28:17 crc kubenswrapper[4935]: E1217 09:28:17.623447 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9\": container with ID starting with 16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9 not found: ID does not exist" 
containerID="16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9" Dec 17 09:28:17 crc kubenswrapper[4935]: I1217 09:28:17.623478 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9"} err="failed to get container status \"16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9\": rpc error: code = NotFound desc = could not find container \"16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9\": container with ID starting with 16b218ab1d3568ce34e77767758245b8257bf54da7377883154ed15e358cd3e9 not found: ID does not exist" Dec 17 09:28:19 crc kubenswrapper[4935]: I1217 09:28:19.146304 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" path="/var/lib/kubelet/pods/b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f/volumes" Dec 17 09:28:23 crc kubenswrapper[4935]: I1217 09:28:23.665050 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:28:25 crc kubenswrapper[4935]: I1217 09:28:25.801436 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 17 09:28:28 crc kubenswrapper[4935]: I1217 09:28:28.567036 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="7c863e0e-b041-4e68-852f-addc7126a215" containerName="rabbitmq" containerID="cri-o://ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2" gracePeriod=604796 Dec 17 09:28:30 crc kubenswrapper[4935]: I1217 09:28:30.751998 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="90598781-7630-4807-9735-eb1cfaba2927" containerName="rabbitmq" containerID="cri-o://83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8" gracePeriod=604796 Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 
09:28:35.175350 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.297620 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-plugins\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.297827 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-config-data\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.297875 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298055 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-erlang-cookie\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298186 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-plugins-conf\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298258 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-tls\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298430 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c863e0e-b041-4e68-852f-addc7126a215-pod-info\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298619 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-server-conf\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298698 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-confd\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298818 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nx7l\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-kube-api-access-5nx7l\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: \"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.298907 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c863e0e-b041-4e68-852f-addc7126a215-erlang-cookie-secret\") pod \"7c863e0e-b041-4e68-852f-addc7126a215\" (UID: 
\"7c863e0e-b041-4e68-852f-addc7126a215\") " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.300390 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.300887 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.301543 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.302483 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.302519 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.305078 4935 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.329104 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/7c863e0e-b041-4e68-852f-addc7126a215-pod-info" (OuterVolumeSpecName: "pod-info") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.338047 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.338210 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c863e0e-b041-4e68-852f-addc7126a215-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.338440 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.338587 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-kube-api-access-5nx7l" (OuterVolumeSpecName: "kube-api-access-5nx7l") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "kube-api-access-5nx7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.395772 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-config-data" (OuterVolumeSpecName: "config-data") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.407406 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nx7l\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-kube-api-access-5nx7l\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.407465 4935 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/7c863e0e-b041-4e68-852f-addc7126a215-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.407482 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.407528 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.407542 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.407555 4935 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/7c863e0e-b041-4e68-852f-addc7126a215-pod-info\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.436412 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.455365 4935 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-server-conf" (OuterVolumeSpecName: "server-conf") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.484860 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "7c863e0e-b041-4e68-852f-addc7126a215" (UID: "7c863e0e-b041-4e68-852f-addc7126a215"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.509702 4935 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/7c863e0e-b041-4e68-852f-addc7126a215-server-conf\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.510017 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/7c863e0e-b041-4e68-852f-addc7126a215-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.510100 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.710195 4935 generic.go:334] "Generic (PLEG): container finished" podID="7c863e0e-b041-4e68-852f-addc7126a215" containerID="ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2" exitCode=0 Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.710345 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.710373 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c863e0e-b041-4e68-852f-addc7126a215","Type":"ContainerDied","Data":"ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2"} Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.712985 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"7c863e0e-b041-4e68-852f-addc7126a215","Type":"ContainerDied","Data":"017fef6e1dce470c680e41b49064b5b026ba6c89c257cc89fac90912886864bf"} Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.713055 4935 scope.go:117] "RemoveContainer" containerID="ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.763383 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.786201 4935 scope.go:117] "RemoveContainer" containerID="ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.802542 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.818489 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:28:35 crc kubenswrapper[4935]: E1217 09:28:35.819036 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="extract-content" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.819056 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="extract-content" Dec 17 09:28:35 crc kubenswrapper[4935]: E1217 09:28:35.819073 4935 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="extract-utilities" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.819079 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="extract-utilities" Dec 17 09:28:35 crc kubenswrapper[4935]: E1217 09:28:35.819102 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c863e0e-b041-4e68-852f-addc7126a215" containerName="rabbitmq" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.819108 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c863e0e-b041-4e68-852f-addc7126a215" containerName="rabbitmq" Dec 17 09:28:35 crc kubenswrapper[4935]: E1217 09:28:35.819115 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="registry-server" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.819121 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="registry-server" Dec 17 09:28:35 crc kubenswrapper[4935]: E1217 09:28:35.819133 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c863e0e-b041-4e68-852f-addc7126a215" containerName="setup-container" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.819139 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c863e0e-b041-4e68-852f-addc7126a215" containerName="setup-container" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.819465 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c863e0e-b041-4e68-852f-addc7126a215" containerName="rabbitmq" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.819481 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03f2cc9-d5ff-4fc4-9c01-ab2572b0355f" containerName="registry-server" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.820664 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.826722 4935 scope.go:117] "RemoveContainer" containerID="ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2" Dec 17 09:28:35 crc kubenswrapper[4935]: E1217 09:28:35.827882 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2\": container with ID starting with ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2 not found: ID does not exist" containerID="ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.827922 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2"} err="failed to get container status \"ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2\": rpc error: code = NotFound desc = could not find container \"ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2\": container with ID starting with ba69da3db190af9f46959ea869f4343f9b75c357ccd69f64c6a3bdf7edf090d2 not found: ID does not exist" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.827956 4935 scope.go:117] "RemoveContainer" containerID="ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.828593 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.828922 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 17 09:28:35 crc kubenswrapper[4935]: E1217 09:28:35.829576 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515\": container with ID starting with ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515 not found: ID does not exist" containerID="ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.829625 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-mzrr5" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.829645 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515"} err="failed to get container status \"ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515\": rpc error: code = NotFound desc = could not find container \"ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515\": container with ID starting with ac038dcb2e3c897bc3fb1393372476ee3d0ea299444f90bc9aca51142cec7515 not found: ID does not exist" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.829755 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.844371 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.850750 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.851212 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.852019 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919189 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919244 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919294 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919323 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919366 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-config-data\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919404 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c25b888a-54e1-47f0-8932-fa07c02f30a1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919459 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919503 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919523 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgv4x\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-kube-api-access-dgv4x\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919545 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c25b888a-54e1-47f0-8932-fa07c02f30a1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:35 crc kubenswrapper[4935]: I1217 09:28:35.919565 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.021882 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c25b888a-54e1-47f0-8932-fa07c02f30a1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.021931 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022039 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022068 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022097 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " 
pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022123 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022157 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-config-data\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022196 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c25b888a-54e1-47f0-8932-fa07c02f30a1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022245 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022297 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.022317 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-dgv4x\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-kube-api-access-dgv4x\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.023233 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.023355 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.023502 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.023779 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.023990 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-config-data\") pod \"rabbitmq-server-0\" 
(UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.024896 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c25b888a-54e1-47f0-8932-fa07c02f30a1-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.031814 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c25b888a-54e1-47f0-8932-fa07c02f30a1-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.031857 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.031822 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c25b888a-54e1-47f0-8932-fa07c02f30a1-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.031929 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.044025 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-dgv4x\" (UniqueName: \"kubernetes.io/projected/c25b888a-54e1-47f0-8932-fa07c02f30a1-kube-api-access-dgv4x\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.060986 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c25b888a-54e1-47f0-8932-fa07c02f30a1\") " pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.204514 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 17 09:28:36 crc kubenswrapper[4935]: I1217 09:28:36.716102 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.150868 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c863e0e-b041-4e68-852f-addc7126a215" path="/var/lib/kubelet/pods/7c863e0e-b041-4e68-852f-addc7126a215/volumes" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.477780 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.566961 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8mx\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-kube-api-access-fc8mx\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567045 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-config-data\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567203 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-plugins\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567330 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-server-conf\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567357 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/90598781-7630-4807-9735-eb1cfaba2927-erlang-cookie-secret\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567437 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567484 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-erlang-cookie\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567525 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-plugins-conf\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567562 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-confd\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567654 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-tls\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.567701 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90598781-7630-4807-9735-eb1cfaba2927-pod-info\") pod \"90598781-7630-4807-9735-eb1cfaba2927\" (UID: \"90598781-7630-4807-9735-eb1cfaba2927\") " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 
09:28:37.567768 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.568583 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.568802 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.569124 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.569142 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.569154 4935 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.597822 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-kube-api-access-fc8mx" (OuterVolumeSpecName: "kube-api-access-fc8mx") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "kube-api-access-fc8mx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.597943 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/90598781-7630-4807-9735-eb1cfaba2927-pod-info" (OuterVolumeSpecName: "pod-info") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.598005 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90598781-7630-4807-9735-eb1cfaba2927-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.598112 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "persistence") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.608634 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.612995 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-config-data" (OuterVolumeSpecName: "config-data") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.648068 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-server-conf" (OuterVolumeSpecName: "server-conf") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.670923 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.670972 4935 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/90598781-7630-4807-9735-eb1cfaba2927-pod-info\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.670983 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8mx\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-kube-api-access-fc8mx\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.670997 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.671006 4935 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/90598781-7630-4807-9735-eb1cfaba2927-server-conf\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.671016 4935 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/90598781-7630-4807-9735-eb1cfaba2927-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.671054 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.711620 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.714658 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "90598781-7630-4807-9735-eb1cfaba2927" (UID: "90598781-7630-4807-9735-eb1cfaba2927"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.737874 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25b888a-54e1-47f0-8932-fa07c02f30a1","Type":"ContainerStarted","Data":"04136e1720f0dd51cdd8a419e82d0b0c2434274c9cc4a879958f0b825e304a97"} Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.740512 4935 generic.go:334] "Generic (PLEG): container finished" podID="90598781-7630-4807-9735-eb1cfaba2927" containerID="83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8" exitCode=0 Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.740556 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90598781-7630-4807-9735-eb1cfaba2927","Type":"ContainerDied","Data":"83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8"} Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.740583 4935 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"90598781-7630-4807-9735-eb1cfaba2927","Type":"ContainerDied","Data":"d8ee6bb81e6432a3b79cbfa5a86e4667399eb978152e3ad2b37754fdf8f9c6b8"} Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.740607 4935 scope.go:117] "RemoveContainer" containerID="83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.740765 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.776393 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.776438 4935 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/90598781-7630-4807-9735-eb1cfaba2927-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.779878 4935 scope.go:117] "RemoveContainer" containerID="60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.801420 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.813405 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.819994 4935 scope.go:117] "RemoveContainer" containerID="83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8" Dec 17 09:28:37 crc kubenswrapper[4935]: E1217 09:28:37.822812 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8\": container with ID starting with 83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8 not found: ID does not exist" containerID="83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.822970 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8"} err="failed to get container status \"83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8\": rpc error: code = NotFound desc = could not find container \"83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8\": container with ID starting with 83a2363abf5469f8ea5a00a4b597e5cc43f6b467bee0c9eff371e24a2f8359b8 not found: ID does not exist" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.823170 4935 scope.go:117] "RemoveContainer" containerID="60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c" Dec 17 09:28:37 crc kubenswrapper[4935]: E1217 09:28:37.823602 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c\": container with ID starting with 60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c not found: ID does not exist" containerID="60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.823643 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c"} err="failed to get container status \"60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c\": rpc error: code = NotFound desc = could not find container \"60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c\": container with ID 
starting with 60cf140bb9d8c677f5deee1b128c932ae079b4d28d2b40cb3333db1be9a7e07c not found: ID does not exist" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.828619 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 17 09:28:37 crc kubenswrapper[4935]: E1217 09:28:37.829190 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90598781-7630-4807-9735-eb1cfaba2927" containerName="setup-container" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.829217 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="90598781-7630-4807-9735-eb1cfaba2927" containerName="setup-container" Dec 17 09:28:37 crc kubenswrapper[4935]: E1217 09:28:37.829238 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90598781-7630-4807-9735-eb1cfaba2927" containerName="rabbitmq" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.829245 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="90598781-7630-4807-9735-eb1cfaba2927" containerName="rabbitmq" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.829469 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="90598781-7630-4807-9735-eb1cfaba2927" containerName="rabbitmq" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.831908 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.838250 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.838903 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.839461 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.839755 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.845348 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.846717 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.847612 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-xcs2m" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.906994 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985435 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985535 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-bt9tt\" (UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-kube-api-access-bt9tt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985577 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985615 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985650 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985673 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985690 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985728 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2fa8407-e7ea-46c7-8144-87caba8e2f45-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985777 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985805 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:37 crc kubenswrapper[4935]: I1217 09:28:37.985846 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2fa8407-e7ea-46c7-8144-87caba8e2f45-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.088204 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt9tt\" 
(UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-kube-api-access-bt9tt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.088863 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.088890 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.088922 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.088940 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.088962 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.089002 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2fa8407-e7ea-46c7-8144-87caba8e2f45-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.089068 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.089103 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.089148 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2fa8407-e7ea-46c7-8144-87caba8e2f45-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.089198 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.089319 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.089845 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.090400 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.090549 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.090590 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.090627 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a2fa8407-e7ea-46c7-8144-87caba8e2f45-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.159610 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a2fa8407-e7ea-46c7-8144-87caba8e2f45-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.166642 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.172315 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a2fa8407-e7ea-46c7-8144-87caba8e2f45-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.172864 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt9tt\" (UniqueName: \"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-kube-api-access-bt9tt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.173696 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a2fa8407-e7ea-46c7-8144-87caba8e2f45-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.178872 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-nzlfp"] Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.189375 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.195351 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.205963 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-nzlfp"] Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.237394 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a2fa8407-e7ea-46c7-8144-87caba8e2f45\") " pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.293869 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.293985 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " 
pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.294040 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-config\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.294084 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.294149 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.294171 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46vv\" (UniqueName: \"kubernetes.io/projected/bbf63a05-252b-4935-b580-be85606d308d-kube-api-access-g46vv\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.294215 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.396417 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.396497 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.396522 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g46vv\" (UniqueName: \"kubernetes.io/projected/bbf63a05-252b-4935-b580-be85606d308d-kube-api-access-g46vv\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.396571 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.396620 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.396682 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.396735 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-config\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.397993 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.397997 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.398180 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " 
pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.398196 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.398459 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.398496 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-config\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.421406 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g46vv\" (UniqueName: \"kubernetes.io/projected/bbf63a05-252b-4935-b580-be85606d308d-kube-api-access-g46vv\") pod \"dnsmasq-dns-668b55cdd7-nzlfp\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.519821 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.608426 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:38 crc kubenswrapper[4935]: I1217 09:28:38.760963 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25b888a-54e1-47f0-8932-fa07c02f30a1","Type":"ContainerStarted","Data":"292d73ea501c523c672b3b6cda5f0450ae686325b24b50211199e441f4b196c0"} Dec 17 09:28:39 crc kubenswrapper[4935]: W1217 09:28:39.036310 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2fa8407_e7ea_46c7_8144_87caba8e2f45.slice/crio-37eea42544a17c107f06d339e52d34f76d878e677e1e5eafec2568c39095e147 WatchSource:0}: Error finding container 37eea42544a17c107f06d339e52d34f76d878e677e1e5eafec2568c39095e147: Status 404 returned error can't find the container with id 37eea42544a17c107f06d339e52d34f76d878e677e1e5eafec2568c39095e147 Dec 17 09:28:39 crc kubenswrapper[4935]: I1217 09:28:39.036655 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 17 09:28:39 crc kubenswrapper[4935]: I1217 09:28:39.138107 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90598781-7630-4807-9735-eb1cfaba2927" path="/var/lib/kubelet/pods/90598781-7630-4807-9735-eb1cfaba2927/volumes" Dec 17 09:28:39 crc kubenswrapper[4935]: I1217 09:28:39.155644 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-nzlfp"] Dec 17 09:28:39 crc kubenswrapper[4935]: W1217 09:28:39.159976 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbf63a05_252b_4935_b580_be85606d308d.slice/crio-755d29826b706afe3d7c78c7393e2af5e5e4eef7e2c79d4050724eb10c2cd2da WatchSource:0}: Error finding container 755d29826b706afe3d7c78c7393e2af5e5e4eef7e2c79d4050724eb10c2cd2da: Status 404 returned error can't find the container with id 
755d29826b706afe3d7c78c7393e2af5e5e4eef7e2c79d4050724eb10c2cd2da Dec 17 09:28:39 crc kubenswrapper[4935]: I1217 09:28:39.772922 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2fa8407-e7ea-46c7-8144-87caba8e2f45","Type":"ContainerStarted","Data":"37eea42544a17c107f06d339e52d34f76d878e677e1e5eafec2568c39095e147"} Dec 17 09:28:39 crc kubenswrapper[4935]: I1217 09:28:39.775113 4935 generic.go:334] "Generic (PLEG): container finished" podID="bbf63a05-252b-4935-b580-be85606d308d" containerID="6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92" exitCode=0 Dec 17 09:28:39 crc kubenswrapper[4935]: I1217 09:28:39.775203 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" event={"ID":"bbf63a05-252b-4935-b580-be85606d308d","Type":"ContainerDied","Data":"6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92"} Dec 17 09:28:39 crc kubenswrapper[4935]: I1217 09:28:39.775251 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" event={"ID":"bbf63a05-252b-4935-b580-be85606d308d","Type":"ContainerStarted","Data":"755d29826b706afe3d7c78c7393e2af5e5e4eef7e2c79d4050724eb10c2cd2da"} Dec 17 09:28:40 crc kubenswrapper[4935]: I1217 09:28:40.798046 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" event={"ID":"bbf63a05-252b-4935-b580-be85606d308d","Type":"ContainerStarted","Data":"d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95"} Dec 17 09:28:40 crc kubenswrapper[4935]: I1217 09:28:40.798513 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:40 crc kubenswrapper[4935]: I1217 09:28:40.818713 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" podStartSLOduration=2.818684283 podStartE2EDuration="2.818684283s" 
podCreationTimestamp="2025-12-17 09:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:28:40.817415662 +0000 UTC m=+1440.477256425" watchObservedRunningTime="2025-12-17 09:28:40.818684283 +0000 UTC m=+1440.478525076" Dec 17 09:28:41 crc kubenswrapper[4935]: I1217 09:28:41.812445 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2fa8407-e7ea-46c7-8144-87caba8e2f45","Type":"ContainerStarted","Data":"8b4993274ea13f19548d809e72e198cf4c03f92392ad3a63744f5795dc153c6a"} Dec 17 09:28:48 crc kubenswrapper[4935]: I1217 09:28:48.610486 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:28:48 crc kubenswrapper[4935]: I1217 09:28:48.685721 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-bgbn2"] Dec 17 09:28:48 crc kubenswrapper[4935]: I1217 09:28:48.686008 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" podUID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerName="dnsmasq-dns" containerID="cri-o://1f8aa261dff5dd5c48b20c4e7d4ed95b443f039de1ab6dca9a2bbf7701779967" gracePeriod=10 Dec 17 09:28:48 crc kubenswrapper[4935]: I1217 09:28:48.920125 4935 generic.go:334] "Generic (PLEG): container finished" podID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerID="1f8aa261dff5dd5c48b20c4e7d4ed95b443f039de1ab6dca9a2bbf7701779967" exitCode=0 Dec 17 09:28:48 crc kubenswrapper[4935]: I1217 09:28:48.920186 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" event={"ID":"95d6599e-4042-4043-a505-9e1ca6037bf2","Type":"ContainerDied","Data":"1f8aa261dff5dd5c48b20c4e7d4ed95b443f039de1ab6dca9a2bbf7701779967"} Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.094104 4935 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-rp2ht"] Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.097117 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.107928 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-rp2ht"] Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.185376 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-config\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.185723 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.185830 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.185964 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " 
pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.186078 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4mdz\" (UniqueName: \"kubernetes.io/projected/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-kube-api-access-q4mdz\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.186194 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.186325 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.290920 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.291027 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4mdz\" (UniqueName: \"kubernetes.io/projected/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-kube-api-access-q4mdz\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: 
\"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.291084 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.291147 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.291178 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-config\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.291196 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.291237 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " 
pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.292384 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.293091 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.294019 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.294579 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.295080 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-config\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.295618 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.336345 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4mdz\" (UniqueName: \"kubernetes.io/projected/dd89ba3b-030b-43ae-9e1a-89d1401f81cd-kube-api-access-q4mdz\") pod \"dnsmasq-dns-66fc59ccbf-rp2ht\" (UID: \"dd89ba3b-030b-43ae-9e1a-89d1401f81cd\") " pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.430942 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.454458 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.497597 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rr8s\" (UniqueName: \"kubernetes.io/projected/95d6599e-4042-4043-a505-9e1ca6037bf2-kube-api-access-2rr8s\") pod \"95d6599e-4042-4043-a505-9e1ca6037bf2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.497681 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-sb\") pod \"95d6599e-4042-4043-a505-9e1ca6037bf2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.497797 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-svc\") pod \"95d6599e-4042-4043-a505-9e1ca6037bf2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.497837 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-config\") pod \"95d6599e-4042-4043-a505-9e1ca6037bf2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.498008 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-nb\") pod \"95d6599e-4042-4043-a505-9e1ca6037bf2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.498042 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-swift-storage-0\") pod \"95d6599e-4042-4043-a505-9e1ca6037bf2\" (UID: \"95d6599e-4042-4043-a505-9e1ca6037bf2\") " Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.510469 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d6599e-4042-4043-a505-9e1ca6037bf2-kube-api-access-2rr8s" (OuterVolumeSpecName: "kube-api-access-2rr8s") pod "95d6599e-4042-4043-a505-9e1ca6037bf2" (UID: "95d6599e-4042-4043-a505-9e1ca6037bf2"). InnerVolumeSpecName "kube-api-access-2rr8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.567783 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "95d6599e-4042-4043-a505-9e1ca6037bf2" (UID: "95d6599e-4042-4043-a505-9e1ca6037bf2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.591659 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "95d6599e-4042-4043-a505-9e1ca6037bf2" (UID: "95d6599e-4042-4043-a505-9e1ca6037bf2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.606064 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.606467 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rr8s\" (UniqueName: \"kubernetes.io/projected/95d6599e-4042-4043-a505-9e1ca6037bf2-kube-api-access-2rr8s\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.606481 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.627860 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "95d6599e-4042-4043-a505-9e1ca6037bf2" (UID: "95d6599e-4042-4043-a505-9e1ca6037bf2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.630085 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-config" (OuterVolumeSpecName: "config") pod "95d6599e-4042-4043-a505-9e1ca6037bf2" (UID: "95d6599e-4042-4043-a505-9e1ca6037bf2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.636913 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "95d6599e-4042-4043-a505-9e1ca6037bf2" (UID: "95d6599e-4042-4043-a505-9e1ca6037bf2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.708975 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.709023 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.709063 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95d6599e-4042-4043-a505-9e1ca6037bf2-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.936832 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" event={"ID":"95d6599e-4042-4043-a505-9e1ca6037bf2","Type":"ContainerDied","Data":"ae57d86c0f132096d203b25527a449e80e887af29f2d49091d9bbf9a3b7e9acc"} Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.936917 4935 scope.go:117] "RemoveContainer" containerID="1f8aa261dff5dd5c48b20c4e7d4ed95b443f039de1ab6dca9a2bbf7701779967" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.936930 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-bgbn2" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.987103 4935 scope.go:117] "RemoveContainer" containerID="18bbb5fbed6234e6527ef3578c016b6b16ab402b161450c578a5d4a6a4e457fc" Dec 17 09:28:49 crc kubenswrapper[4935]: I1217 09:28:49.989402 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-bgbn2"] Dec 17 09:28:50 crc kubenswrapper[4935]: I1217 09:28:50.008330 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-bgbn2"] Dec 17 09:28:50 crc kubenswrapper[4935]: I1217 09:28:50.019063 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-rp2ht"] Dec 17 09:28:50 crc kubenswrapper[4935]: W1217 09:28:50.033636 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd89ba3b_030b_43ae_9e1a_89d1401f81cd.slice/crio-bee09271faf76b1361d0b4547b9ee7f11ff45b0906dbdbf0a728b9c02499ad01 WatchSource:0}: Error finding container bee09271faf76b1361d0b4547b9ee7f11ff45b0906dbdbf0a728b9c02499ad01: Status 404 returned error can't find the container with id bee09271faf76b1361d0b4547b9ee7f11ff45b0906dbdbf0a728b9c02499ad01 Dec 17 09:28:50 crc kubenswrapper[4935]: I1217 09:28:50.952192 4935 generic.go:334] "Generic (PLEG): container finished" podID="dd89ba3b-030b-43ae-9e1a-89d1401f81cd" containerID="55cce73e8bfc156795f4f725ff9a98a7f943f5533c4581c4c9d8aae83e0d9f3a" exitCode=0 Dec 17 09:28:50 crc kubenswrapper[4935]: I1217 09:28:50.952320 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" event={"ID":"dd89ba3b-030b-43ae-9e1a-89d1401f81cd","Type":"ContainerDied","Data":"55cce73e8bfc156795f4f725ff9a98a7f943f5533c4581c4c9d8aae83e0d9f3a"} Dec 17 09:28:50 crc kubenswrapper[4935]: I1217 09:28:50.952848 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" event={"ID":"dd89ba3b-030b-43ae-9e1a-89d1401f81cd","Type":"ContainerStarted","Data":"bee09271faf76b1361d0b4547b9ee7f11ff45b0906dbdbf0a728b9c02499ad01"} Dec 17 09:28:51 crc kubenswrapper[4935]: I1217 09:28:51.144862 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d6599e-4042-4043-a505-9e1ca6037bf2" path="/var/lib/kubelet/pods/95d6599e-4042-4043-a505-9e1ca6037bf2/volumes" Dec 17 09:28:51 crc kubenswrapper[4935]: I1217 09:28:51.973683 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" event={"ID":"dd89ba3b-030b-43ae-9e1a-89d1401f81cd","Type":"ContainerStarted","Data":"9d86aec906c4d7398cfe6d2582a4c964469ff146da9db1224fffa6172f22ebde"} Dec 17 09:28:51 crc kubenswrapper[4935]: I1217 09:28:51.974514 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:51 crc kubenswrapper[4935]: I1217 09:28:51.998238 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" podStartSLOduration=2.998213184 podStartE2EDuration="2.998213184s" podCreationTimestamp="2025-12-17 09:28:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:28:51.994120345 +0000 UTC m=+1451.653961118" watchObservedRunningTime="2025-12-17 09:28:51.998213184 +0000 UTC m=+1451.658053957" Dec 17 09:28:59 crc kubenswrapper[4935]: I1217 09:28:59.433548 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66fc59ccbf-rp2ht" Dec 17 09:28:59 crc kubenswrapper[4935]: I1217 09:28:59.507756 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-nzlfp"] Dec 17 09:28:59 crc kubenswrapper[4935]: I1217 09:28:59.508094 4935 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" podUID="bbf63a05-252b-4935-b580-be85606d308d" containerName="dnsmasq-dns" containerID="cri-o://d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95" gracePeriod=10 Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.060354 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.063243 4935 generic.go:334] "Generic (PLEG): container finished" podID="bbf63a05-252b-4935-b580-be85606d308d" containerID="d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95" exitCode=0 Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.063339 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" event={"ID":"bbf63a05-252b-4935-b580-be85606d308d","Type":"ContainerDied","Data":"d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95"} Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.063386 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" event={"ID":"bbf63a05-252b-4935-b580-be85606d308d","Type":"ContainerDied","Data":"755d29826b706afe3d7c78c7393e2af5e5e4eef7e2c79d4050724eb10c2cd2da"} Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.063407 4935 scope.go:117] "RemoveContainer" containerID="d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.110144 4935 scope.go:117] "RemoveContainer" containerID="6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.142721 4935 scope.go:117] "RemoveContainer" containerID="d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95" Dec 17 09:29:00 crc kubenswrapper[4935]: E1217 09:29:00.143634 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95\": container with ID starting with d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95 not found: ID does not exist" containerID="d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.143673 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95"} err="failed to get container status \"d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95\": rpc error: code = NotFound desc = could not find container \"d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95\": container with ID starting with d45c24a12e13b25139fcfbb464e5e325cd8ed3cc31f7a9b9ec9b9c7969d55c95 not found: ID does not exist" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.143701 4935 scope.go:117] "RemoveContainer" containerID="6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92" Dec 17 09:29:00 crc kubenswrapper[4935]: E1217 09:29:00.144062 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92\": container with ID starting with 6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92 not found: ID does not exist" containerID="6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.144117 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92"} err="failed to get container status \"6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92\": rpc error: code = NotFound desc = could not find container \"6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92\": 
container with ID starting with 6b7e7ffb435e904184a0c2384f472d3a9e90dcd00d4e992b379494d85d924e92 not found: ID does not exist" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.170155 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-config\") pod \"bbf63a05-252b-4935-b580-be85606d308d\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.170237 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-svc\") pod \"bbf63a05-252b-4935-b580-be85606d308d\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.170292 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g46vv\" (UniqueName: \"kubernetes.io/projected/bbf63a05-252b-4935-b580-be85606d308d-kube-api-access-g46vv\") pod \"bbf63a05-252b-4935-b580-be85606d308d\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.170394 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-nb\") pod \"bbf63a05-252b-4935-b580-be85606d308d\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.170445 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-swift-storage-0\") pod \"bbf63a05-252b-4935-b580-be85606d308d\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.170539 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-sb\") pod \"bbf63a05-252b-4935-b580-be85606d308d\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.170744 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-openstack-edpm-ipam\") pod \"bbf63a05-252b-4935-b580-be85606d308d\" (UID: \"bbf63a05-252b-4935-b580-be85606d308d\") " Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.177714 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbf63a05-252b-4935-b580-be85606d308d-kube-api-access-g46vv" (OuterVolumeSpecName: "kube-api-access-g46vv") pod "bbf63a05-252b-4935-b580-be85606d308d" (UID: "bbf63a05-252b-4935-b580-be85606d308d"). InnerVolumeSpecName "kube-api-access-g46vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.237095 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bbf63a05-252b-4935-b580-be85606d308d" (UID: "bbf63a05-252b-4935-b580-be85606d308d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.237511 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-config" (OuterVolumeSpecName: "config") pod "bbf63a05-252b-4935-b580-be85606d308d" (UID: "bbf63a05-252b-4935-b580-be85606d308d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.237971 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbf63a05-252b-4935-b580-be85606d308d" (UID: "bbf63a05-252b-4935-b580-be85606d308d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.248975 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "bbf63a05-252b-4935-b580-be85606d308d" (UID: "bbf63a05-252b-4935-b580-be85606d308d"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.251429 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bbf63a05-252b-4935-b580-be85606d308d" (UID: "bbf63a05-252b-4935-b580-be85606d308d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.254738 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bbf63a05-252b-4935-b580-be85606d308d" (UID: "bbf63a05-252b-4935-b580-be85606d308d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.274192 4935 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-config\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.274253 4935 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.274264 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g46vv\" (UniqueName: \"kubernetes.io/projected/bbf63a05-252b-4935-b580-be85606d308d-kube-api-access-g46vv\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.274314 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.274324 4935 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.274334 4935 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:00 crc kubenswrapper[4935]: I1217 09:29:00.274343 4935 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/bbf63a05-252b-4935-b580-be85606d308d-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:01 crc kubenswrapper[4935]: I1217 09:29:01.075259 
4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-nzlfp" Dec 17 09:29:01 crc kubenswrapper[4935]: I1217 09:29:01.109235 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-nzlfp"] Dec 17 09:29:01 crc kubenswrapper[4935]: I1217 09:29:01.136696 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-nzlfp"] Dec 17 09:29:03 crc kubenswrapper[4935]: I1217 09:29:03.140928 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbf63a05-252b-4935-b580-be85606d308d" path="/var/lib/kubelet/pods/bbf63a05-252b-4935-b580-be85606d308d/volumes" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.334476 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284"] Dec 17 09:29:08 crc kubenswrapper[4935]: E1217 09:29:08.336733 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerName="init" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.336775 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerName="init" Dec 17 09:29:08 crc kubenswrapper[4935]: E1217 09:29:08.336834 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf63a05-252b-4935-b580-be85606d308d" containerName="dnsmasq-dns" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.336853 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf63a05-252b-4935-b580-be85606d308d" containerName="dnsmasq-dns" Dec 17 09:29:08 crc kubenswrapper[4935]: E1217 09:29:08.336908 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerName="dnsmasq-dns" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.336927 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerName="dnsmasq-dns" Dec 17 09:29:08 crc kubenswrapper[4935]: E1217 09:29:08.336976 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbf63a05-252b-4935-b580-be85606d308d" containerName="init" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.336992 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbf63a05-252b-4935-b580-be85606d308d" containerName="init" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.337512 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbf63a05-252b-4935-b580-be85606d308d" containerName="dnsmasq-dns" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.337618 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d6599e-4042-4043-a505-9e1ca6037bf2" containerName="dnsmasq-dns" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.339209 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.341664 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.341835 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.341963 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.345236 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.345653 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284"] Dec 17 09:29:08 crc 
kubenswrapper[4935]: I1217 09:29:08.483290 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pbsh\" (UniqueName: \"kubernetes.io/projected/5ce01088-f49b-44be-b4a9-08cd183488de-kube-api-access-4pbsh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.483380 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.483804 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.484099 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.587459 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.587630 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pbsh\" (UniqueName: \"kubernetes.io/projected/5ce01088-f49b-44be-b4a9-08cd183488de-kube-api-access-4pbsh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.587710 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.587829 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.595599 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.596835 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.597322 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.623618 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pbsh\" (UniqueName: \"kubernetes.io/projected/5ce01088-f49b-44be-b4a9-08cd183488de-kube-api-access-4pbsh\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-vs284\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:08 crc kubenswrapper[4935]: I1217 09:29:08.674215 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:09 crc kubenswrapper[4935]: I1217 09:29:09.253116 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284"] Dec 17 09:29:10 crc kubenswrapper[4935]: I1217 09:29:10.172217 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" event={"ID":"5ce01088-f49b-44be-b4a9-08cd183488de","Type":"ContainerStarted","Data":"fe9ea1c0c895b898a4d6419cdd61e31c83eaba5cd1fc2a8f316a7863d99926e4"} Dec 17 09:29:11 crc kubenswrapper[4935]: I1217 09:29:11.194073 4935 generic.go:334] "Generic (PLEG): container finished" podID="c25b888a-54e1-47f0-8932-fa07c02f30a1" containerID="292d73ea501c523c672b3b6cda5f0450ae686325b24b50211199e441f4b196c0" exitCode=0 Dec 17 09:29:11 crc kubenswrapper[4935]: I1217 09:29:11.194170 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25b888a-54e1-47f0-8932-fa07c02f30a1","Type":"ContainerDied","Data":"292d73ea501c523c672b3b6cda5f0450ae686325b24b50211199e441f4b196c0"} Dec 17 09:29:13 crc kubenswrapper[4935]: I1217 09:29:13.233884 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c25b888a-54e1-47f0-8932-fa07c02f30a1","Type":"ContainerStarted","Data":"6d3e40ea517fbd9fa8d67291d0b2b70fb7b167035144a7812e91349fec030c3f"} Dec 17 09:29:13 crc kubenswrapper[4935]: I1217 09:29:13.235106 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 17 09:29:14 crc kubenswrapper[4935]: I1217 09:29:14.247049 4935 generic.go:334] "Generic (PLEG): container finished" podID="a2fa8407-e7ea-46c7-8144-87caba8e2f45" containerID="8b4993274ea13f19548d809e72e198cf4c03f92392ad3a63744f5795dc153c6a" exitCode=0 Dec 17 09:29:14 crc kubenswrapper[4935]: I1217 09:29:14.247742 4935 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2fa8407-e7ea-46c7-8144-87caba8e2f45","Type":"ContainerDied","Data":"8b4993274ea13f19548d809e72e198cf4c03f92392ad3a63744f5795dc153c6a"} Dec 17 09:29:14 crc kubenswrapper[4935]: I1217 09:29:14.283840 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.283818541 podStartE2EDuration="39.283818541s" podCreationTimestamp="2025-12-17 09:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 09:29:13.271858185 +0000 UTC m=+1472.931698948" watchObservedRunningTime="2025-12-17 09:29:14.283818541 +0000 UTC m=+1473.943659304" Dec 17 09:29:21 crc kubenswrapper[4935]: I1217 09:29:21.327828 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a2fa8407-e7ea-46c7-8144-87caba8e2f45","Type":"ContainerStarted","Data":"6767e0066ee1ad21aab0e0e8485cd572076458730aa1c14b3f10d92435a1bfe9"} Dec 17 09:29:21 crc kubenswrapper[4935]: I1217 09:29:21.330206 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:29:21 crc kubenswrapper[4935]: I1217 09:29:21.331926 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" event={"ID":"5ce01088-f49b-44be-b4a9-08cd183488de","Type":"ContainerStarted","Data":"d126c8b75d2a630a18d92972e33ef42ac24123dbacd086eb866b825d78b3c539"} Dec 17 09:29:21 crc kubenswrapper[4935]: I1217 09:29:21.371323 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.371292694 podStartE2EDuration="44.371292694s" podCreationTimestamp="2025-12-17 09:28:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-17 09:29:21.363813601 +0000 UTC m=+1481.023654364" watchObservedRunningTime="2025-12-17 09:29:21.371292694 +0000 UTC m=+1481.031133477" Dec 17 09:29:26 crc kubenswrapper[4935]: I1217 09:29:26.207841 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 17 09:29:26 crc kubenswrapper[4935]: I1217 09:29:26.232029 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" podStartSLOduration=6.727965241 podStartE2EDuration="18.232004272s" podCreationTimestamp="2025-12-17 09:29:08 +0000 UTC" firstStartedPulling="2025-12-17 09:29:09.263597224 +0000 UTC m=+1468.923437987" lastFinishedPulling="2025-12-17 09:29:20.767636255 +0000 UTC m=+1480.427477018" observedRunningTime="2025-12-17 09:29:21.38966554 +0000 UTC m=+1481.049506333" watchObservedRunningTime="2025-12-17 09:29:26.232004272 +0000 UTC m=+1485.891845035" Dec 17 09:29:33 crc kubenswrapper[4935]: I1217 09:29:33.479539 4935 generic.go:334] "Generic (PLEG): container finished" podID="5ce01088-f49b-44be-b4a9-08cd183488de" containerID="d126c8b75d2a630a18d92972e33ef42ac24123dbacd086eb866b825d78b3c539" exitCode=0 Dec 17 09:29:33 crc kubenswrapper[4935]: I1217 09:29:33.479670 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" event={"ID":"5ce01088-f49b-44be-b4a9-08cd183488de","Type":"ContainerDied","Data":"d126c8b75d2a630a18d92972e33ef42ac24123dbacd086eb866b825d78b3c539"} Dec 17 09:29:34 crc kubenswrapper[4935]: I1217 09:29:34.980056 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.136917 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-repo-setup-combined-ca-bundle\") pod \"5ce01088-f49b-44be-b4a9-08cd183488de\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.137159 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-ssh-key\") pod \"5ce01088-f49b-44be-b4a9-08cd183488de\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.137246 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-inventory\") pod \"5ce01088-f49b-44be-b4a9-08cd183488de\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.137337 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pbsh\" (UniqueName: \"kubernetes.io/projected/5ce01088-f49b-44be-b4a9-08cd183488de-kube-api-access-4pbsh\") pod \"5ce01088-f49b-44be-b4a9-08cd183488de\" (UID: \"5ce01088-f49b-44be-b4a9-08cd183488de\") " Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.144647 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce01088-f49b-44be-b4a9-08cd183488de-kube-api-access-4pbsh" (OuterVolumeSpecName: "kube-api-access-4pbsh") pod "5ce01088-f49b-44be-b4a9-08cd183488de" (UID: "5ce01088-f49b-44be-b4a9-08cd183488de"). InnerVolumeSpecName "kube-api-access-4pbsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.145256 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5ce01088-f49b-44be-b4a9-08cd183488de" (UID: "5ce01088-f49b-44be-b4a9-08cd183488de"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.167395 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5ce01088-f49b-44be-b4a9-08cd183488de" (UID: "5ce01088-f49b-44be-b4a9-08cd183488de"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.168049 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-inventory" (OuterVolumeSpecName: "inventory") pod "5ce01088-f49b-44be-b4a9-08cd183488de" (UID: "5ce01088-f49b-44be-b4a9-08cd183488de"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.241558 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.241596 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.241608 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pbsh\" (UniqueName: \"kubernetes.io/projected/5ce01088-f49b-44be-b4a9-08cd183488de-kube-api-access-4pbsh\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.241620 4935 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce01088-f49b-44be-b4a9-08cd183488de-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.504219 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" event={"ID":"5ce01088-f49b-44be-b4a9-08cd183488de","Type":"ContainerDied","Data":"fe9ea1c0c895b898a4d6419cdd61e31c83eaba5cd1fc2a8f316a7863d99926e4"} Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.504329 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-vs284" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.504336 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe9ea1c0c895b898a4d6419cdd61e31c83eaba5cd1fc2a8f316a7863d99926e4" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.615009 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx"] Dec 17 09:29:35 crc kubenswrapper[4935]: E1217 09:29:35.615659 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce01088-f49b-44be-b4a9-08cd183488de" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.615685 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce01088-f49b-44be-b4a9-08cd183488de" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.615973 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce01088-f49b-44be-b4a9-08cd183488de" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.617091 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.619214 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.619808 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.620579 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.621480 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.629740 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx"] Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.754864 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mwzr\" (UniqueName: \"kubernetes.io/projected/d34c1a27-0426-4b46-bf51-77110a3929cd-kube-api-access-5mwzr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.755554 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.755602 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.857690 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.857782 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.857898 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mwzr\" (UniqueName: \"kubernetes.io/projected/d34c1a27-0426-4b46-bf51-77110a3929cd-kube-api-access-5mwzr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.864724 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.864851 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.886994 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mwzr\" (UniqueName: \"kubernetes.io/projected/d34c1a27-0426-4b46-bf51-77110a3929cd-kube-api-access-5mwzr\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-m9znx\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:35 crc kubenswrapper[4935]: I1217 09:29:35.937726 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:36 crc kubenswrapper[4935]: I1217 09:29:36.536393 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx"] Dec 17 09:29:37 crc kubenswrapper[4935]: I1217 09:29:37.529895 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" event={"ID":"d34c1a27-0426-4b46-bf51-77110a3929cd","Type":"ContainerStarted","Data":"42ee9ecace1dc32335d382f02b5b7017ba948fdb240e5a4c6daebc5d35a8f0f4"} Dec 17 09:29:37 crc kubenswrapper[4935]: I1217 09:29:37.530602 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" event={"ID":"d34c1a27-0426-4b46-bf51-77110a3929cd","Type":"ContainerStarted","Data":"89f045642bd68a1326862248c1f59c3d0e20f4ca43c4bd68d3107fd208d29845"} Dec 17 09:29:37 crc kubenswrapper[4935]: I1217 09:29:37.556044 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" podStartSLOduration=2.062438156 podStartE2EDuration="2.556023067s" podCreationTimestamp="2025-12-17 09:29:35 +0000 UTC" firstStartedPulling="2025-12-17 09:29:36.546179382 +0000 UTC m=+1496.206020145" lastFinishedPulling="2025-12-17 09:29:37.039764253 +0000 UTC m=+1496.699605056" observedRunningTime="2025-12-17 09:29:37.55493277 +0000 UTC m=+1497.214773573" watchObservedRunningTime="2025-12-17 09:29:37.556023067 +0000 UTC m=+1497.215863830" Dec 17 09:29:38 crc kubenswrapper[4935]: I1217 09:29:38.523552 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 17 09:29:40 crc kubenswrapper[4935]: I1217 09:29:40.563227 4935 generic.go:334] "Generic (PLEG): container finished" podID="d34c1a27-0426-4b46-bf51-77110a3929cd" 
containerID="42ee9ecace1dc32335d382f02b5b7017ba948fdb240e5a4c6daebc5d35a8f0f4" exitCode=0 Dec 17 09:29:40 crc kubenswrapper[4935]: I1217 09:29:40.563318 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" event={"ID":"d34c1a27-0426-4b46-bf51-77110a3929cd","Type":"ContainerDied","Data":"42ee9ecace1dc32335d382f02b5b7017ba948fdb240e5a4c6daebc5d35a8f0f4"} Dec 17 09:29:41 crc kubenswrapper[4935]: I1217 09:29:41.983620 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.131319 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-inventory\") pod \"d34c1a27-0426-4b46-bf51-77110a3929cd\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.131503 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-ssh-key\") pod \"d34c1a27-0426-4b46-bf51-77110a3929cd\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.131655 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mwzr\" (UniqueName: \"kubernetes.io/projected/d34c1a27-0426-4b46-bf51-77110a3929cd-kube-api-access-5mwzr\") pod \"d34c1a27-0426-4b46-bf51-77110a3929cd\" (UID: \"d34c1a27-0426-4b46-bf51-77110a3929cd\") " Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.139590 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34c1a27-0426-4b46-bf51-77110a3929cd-kube-api-access-5mwzr" (OuterVolumeSpecName: "kube-api-access-5mwzr") pod "d34c1a27-0426-4b46-bf51-77110a3929cd" (UID: 
"d34c1a27-0426-4b46-bf51-77110a3929cd"). InnerVolumeSpecName "kube-api-access-5mwzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.172550 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-inventory" (OuterVolumeSpecName: "inventory") pod "d34c1a27-0426-4b46-bf51-77110a3929cd" (UID: "d34c1a27-0426-4b46-bf51-77110a3929cd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.191486 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d34c1a27-0426-4b46-bf51-77110a3929cd" (UID: "d34c1a27-0426-4b46-bf51-77110a3929cd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.235462 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.235516 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mwzr\" (UniqueName: \"kubernetes.io/projected/d34c1a27-0426-4b46-bf51-77110a3929cd-kube-api-access-5mwzr\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.235541 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d34c1a27-0426-4b46-bf51-77110a3929cd-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.587767 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" 
event={"ID":"d34c1a27-0426-4b46-bf51-77110a3929cd","Type":"ContainerDied","Data":"89f045642bd68a1326862248c1f59c3d0e20f4ca43c4bd68d3107fd208d29845"} Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.587817 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89f045642bd68a1326862248c1f59c3d0e20f4ca43c4bd68d3107fd208d29845" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.587863 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-m9znx" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.723804 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz"] Dec 17 09:29:42 crc kubenswrapper[4935]: E1217 09:29:42.724401 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34c1a27-0426-4b46-bf51-77110a3929cd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.724423 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34c1a27-0426-4b46-bf51-77110a3929cd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.724622 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34c1a27-0426-4b46-bf51-77110a3929cd" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.725563 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.725630 4935 scope.go:117] "RemoveContainer" containerID="369e0278087dc0bd0f384b1fa90f4ab44f3c28c60b52cbac9c4d94759ed8c3c1" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.730693 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.730857 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.731233 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.740548 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.741776 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz"] Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.862259 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.873573 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: 
\"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.874015 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gj9m\" (UniqueName: \"kubernetes.io/projected/1b3c1c73-3f87-4383-9d09-1931001f0629-kube-api-access-8gj9m\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.874379 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.977621 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.977810 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.977875 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.977928 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gj9m\" (UniqueName: \"kubernetes.io/projected/1b3c1c73-3f87-4383-9d09-1931001f0629-kube-api-access-8gj9m\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.984112 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.984636 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:42 crc kubenswrapper[4935]: I1217 09:29:42.985164 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:43 crc kubenswrapper[4935]: I1217 09:29:43.005212 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gj9m\" (UniqueName: \"kubernetes.io/projected/1b3c1c73-3f87-4383-9d09-1931001f0629-kube-api-access-8gj9m\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:43 crc kubenswrapper[4935]: I1217 09:29:43.052460 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:29:43 crc kubenswrapper[4935]: I1217 09:29:43.594064 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz"] Dec 17 09:29:44 crc kubenswrapper[4935]: I1217 09:29:44.631493 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" event={"ID":"1b3c1c73-3f87-4383-9d09-1931001f0629","Type":"ContainerStarted","Data":"3d19c5e7ea47a0f5b2a81a48e0295d0281e4f4f6320f053dc46e0e51bff41958"} Dec 17 09:29:44 crc kubenswrapper[4935]: I1217 09:29:44.632520 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" event={"ID":"1b3c1c73-3f87-4383-9d09-1931001f0629","Type":"ContainerStarted","Data":"a56b13cfd4ded465f995e7ba9570b6dad17d37629de903c7ae23838148397512"} Dec 17 09:29:44 crc kubenswrapper[4935]: I1217 09:29:44.660403 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" podStartSLOduration=2.209383943 podStartE2EDuration="2.66038596s" podCreationTimestamp="2025-12-17 09:29:42 +0000 UTC" firstStartedPulling="2025-12-17 09:29:43.605352576 +0000 UTC m=+1503.265193339" 
lastFinishedPulling="2025-12-17 09:29:44.056354593 +0000 UTC m=+1503.716195356" observedRunningTime="2025-12-17 09:29:44.652702712 +0000 UTC m=+1504.312543475" watchObservedRunningTime="2025-12-17 09:29:44.66038596 +0000 UTC m=+1504.320226723" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.133972 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.135086 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.218354 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d"] Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.220220 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.242667 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.243042 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.244231 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d"] Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.271160 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2bb91fe-0bf5-4422-a58c-0677652f6baa-secret-volume\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.271253 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4vgf\" (UniqueName: \"kubernetes.io/projected/b2bb91fe-0bf5-4422-a58c-0677652f6baa-kube-api-access-j4vgf\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.271422 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2bb91fe-0bf5-4422-a58c-0677652f6baa-config-volume\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.372952 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2bb91fe-0bf5-4422-a58c-0677652f6baa-secret-volume\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.373020 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4vgf\" (UniqueName: \"kubernetes.io/projected/b2bb91fe-0bf5-4422-a58c-0677652f6baa-kube-api-access-j4vgf\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.373083 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2bb91fe-0bf5-4422-a58c-0677652f6baa-config-volume\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.374104 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2bb91fe-0bf5-4422-a58c-0677652f6baa-config-volume\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.380792 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b2bb91fe-0bf5-4422-a58c-0677652f6baa-secret-volume\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.393233 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4vgf\" (UniqueName: \"kubernetes.io/projected/b2bb91fe-0bf5-4422-a58c-0677652f6baa-kube-api-access-j4vgf\") pod \"collect-profiles-29432730-9tg7d\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:00 crc kubenswrapper[4935]: I1217 09:30:00.570410 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:01 crc kubenswrapper[4935]: I1217 09:30:01.784265 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d"] Dec 17 09:30:01 crc kubenswrapper[4935]: I1217 09:30:01.827506 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" event={"ID":"b2bb91fe-0bf5-4422-a58c-0677652f6baa","Type":"ContainerStarted","Data":"b2a969d120be2481a5df8e82876588b4208209a594ea555a579844e222528291"} Dec 17 09:30:02 crc kubenswrapper[4935]: I1217 09:30:02.839151 4935 generic.go:334] "Generic (PLEG): container finished" podID="b2bb91fe-0bf5-4422-a58c-0677652f6baa" containerID="40815833fa0990eb933830d7c29c34fdf89f68c098f6d02445d74910128e0333" exitCode=0 Dec 17 09:30:02 crc kubenswrapper[4935]: I1217 09:30:02.839244 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" 
event={"ID":"b2bb91fe-0bf5-4422-a58c-0677652f6baa","Type":"ContainerDied","Data":"40815833fa0990eb933830d7c29c34fdf89f68c098f6d02445d74910128e0333"} Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.193369 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.371105 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2bb91fe-0bf5-4422-a58c-0677652f6baa-secret-volume\") pod \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.371833 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2bb91fe-0bf5-4422-a58c-0677652f6baa-config-volume\") pod \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.371879 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4vgf\" (UniqueName: \"kubernetes.io/projected/b2bb91fe-0bf5-4422-a58c-0677652f6baa-kube-api-access-j4vgf\") pod \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\" (UID: \"b2bb91fe-0bf5-4422-a58c-0677652f6baa\") " Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.377939 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2bb91fe-0bf5-4422-a58c-0677652f6baa-config-volume" (OuterVolumeSpecName: "config-volume") pod "b2bb91fe-0bf5-4422-a58c-0677652f6baa" (UID: "b2bb91fe-0bf5-4422-a58c-0677652f6baa"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.388476 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2bb91fe-0bf5-4422-a58c-0677652f6baa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b2bb91fe-0bf5-4422-a58c-0677652f6baa" (UID: "b2bb91fe-0bf5-4422-a58c-0677652f6baa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.392585 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bb91fe-0bf5-4422-a58c-0677652f6baa-kube-api-access-j4vgf" (OuterVolumeSpecName: "kube-api-access-j4vgf") pod "b2bb91fe-0bf5-4422-a58c-0677652f6baa" (UID: "b2bb91fe-0bf5-4422-a58c-0677652f6baa"). InnerVolumeSpecName "kube-api-access-j4vgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.474968 4935 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b2bb91fe-0bf5-4422-a58c-0677652f6baa-config-volume\") on node \"crc\" DevicePath \"\"" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.475380 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4vgf\" (UniqueName: \"kubernetes.io/projected/b2bb91fe-0bf5-4422-a58c-0677652f6baa-kube-api-access-j4vgf\") on node \"crc\" DevicePath \"\"" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.475447 4935 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b2bb91fe-0bf5-4422-a58c-0677652f6baa-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.863379 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" 
event={"ID":"b2bb91fe-0bf5-4422-a58c-0677652f6baa","Type":"ContainerDied","Data":"b2a969d120be2481a5df8e82876588b4208209a594ea555a579844e222528291"} Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.863432 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2a969d120be2481a5df8e82876588b4208209a594ea555a579844e222528291" Dec 17 09:30:04 crc kubenswrapper[4935]: I1217 09:30:04.863477 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.072838 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fhlfb"] Dec 17 09:30:10 crc kubenswrapper[4935]: E1217 09:30:10.074246 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bb91fe-0bf5-4422-a58c-0677652f6baa" containerName="collect-profiles" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.074265 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bb91fe-0bf5-4422-a58c-0677652f6baa" containerName="collect-profiles" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.074599 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2bb91fe-0bf5-4422-a58c-0677652f6baa" containerName="collect-profiles" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.076758 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.082247 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhlfb"] Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.237795 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6cvx\" (UniqueName: \"kubernetes.io/projected/38a86b29-02d7-4452-ada0-b76ccf29b2e6-kube-api-access-m6cvx\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.237918 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-catalog-content\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.237985 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-utilities\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.340207 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6cvx\" (UniqueName: \"kubernetes.io/projected/38a86b29-02d7-4452-ada0-b76ccf29b2e6-kube-api-access-m6cvx\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.340318 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-catalog-content\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.340387 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-utilities\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.340908 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-utilities\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.341064 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-catalog-content\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.364465 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6cvx\" (UniqueName: \"kubernetes.io/projected/38a86b29-02d7-4452-ada0-b76ccf29b2e6-kube-api-access-m6cvx\") pod \"community-operators-fhlfb\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.400522 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:10 crc kubenswrapper[4935]: I1217 09:30:10.970977 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fhlfb"] Dec 17 09:30:11 crc kubenswrapper[4935]: I1217 09:30:11.932184 4935 generic.go:334] "Generic (PLEG): container finished" podID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerID="f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960" exitCode=0 Dec 17 09:30:11 crc kubenswrapper[4935]: I1217 09:30:11.932251 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhlfb" event={"ID":"38a86b29-02d7-4452-ada0-b76ccf29b2e6","Type":"ContainerDied","Data":"f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960"} Dec 17 09:30:11 crc kubenswrapper[4935]: I1217 09:30:11.932333 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhlfb" event={"ID":"38a86b29-02d7-4452-ada0-b76ccf29b2e6","Type":"ContainerStarted","Data":"a062e80f898c6d7157cd4a6dc486b718d66da673ad4c9733d7e57e31e9957000"} Dec 17 09:30:13 crc kubenswrapper[4935]: I1217 09:30:13.993851 4935 generic.go:334] "Generic (PLEG): container finished" podID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerID="2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867" exitCode=0 Dec 17 09:30:13 crc kubenswrapper[4935]: I1217 09:30:13.994534 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhlfb" event={"ID":"38a86b29-02d7-4452-ada0-b76ccf29b2e6","Type":"ContainerDied","Data":"2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867"} Dec 17 09:30:16 crc kubenswrapper[4935]: I1217 09:30:16.012813 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhlfb" 
event={"ID":"38a86b29-02d7-4452-ada0-b76ccf29b2e6","Type":"ContainerStarted","Data":"3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc"} Dec 17 09:30:16 crc kubenswrapper[4935]: I1217 09:30:16.041112 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fhlfb" podStartSLOduration=2.9136520089999998 podStartE2EDuration="6.041094194s" podCreationTimestamp="2025-12-17 09:30:10 +0000 UTC" firstStartedPulling="2025-12-17 09:30:11.934526742 +0000 UTC m=+1531.594367505" lastFinishedPulling="2025-12-17 09:30:15.061968927 +0000 UTC m=+1534.721809690" observedRunningTime="2025-12-17 09:30:16.038203704 +0000 UTC m=+1535.698044507" watchObservedRunningTime="2025-12-17 09:30:16.041094194 +0000 UTC m=+1535.700934957" Dec 17 09:30:20 crc kubenswrapper[4935]: I1217 09:30:20.401306 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:20 crc kubenswrapper[4935]: I1217 09:30:20.401992 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:20 crc kubenswrapper[4935]: I1217 09:30:20.452791 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:21 crc kubenswrapper[4935]: I1217 09:30:21.103365 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:21 crc kubenswrapper[4935]: I1217 09:30:21.154196 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhlfb"] Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.075932 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fhlfb" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="registry-server" 
containerID="cri-o://3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc" gracePeriod=2 Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.576826 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.658793 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-catalog-content\") pod \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.658936 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-utilities\") pod \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.659144 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6cvx\" (UniqueName: \"kubernetes.io/projected/38a86b29-02d7-4452-ada0-b76ccf29b2e6-kube-api-access-m6cvx\") pod \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\" (UID: \"38a86b29-02d7-4452-ada0-b76ccf29b2e6\") " Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.660031 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-utilities" (OuterVolumeSpecName: "utilities") pod "38a86b29-02d7-4452-ada0-b76ccf29b2e6" (UID: "38a86b29-02d7-4452-ada0-b76ccf29b2e6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.669219 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38a86b29-02d7-4452-ada0-b76ccf29b2e6-kube-api-access-m6cvx" (OuterVolumeSpecName: "kube-api-access-m6cvx") pod "38a86b29-02d7-4452-ada0-b76ccf29b2e6" (UID: "38a86b29-02d7-4452-ada0-b76ccf29b2e6"). InnerVolumeSpecName "kube-api-access-m6cvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.718776 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38a86b29-02d7-4452-ada0-b76ccf29b2e6" (UID: "38a86b29-02d7-4452-ada0-b76ccf29b2e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.761693 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.761744 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38a86b29-02d7-4452-ada0-b76ccf29b2e6-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:30:23 crc kubenswrapper[4935]: I1217 09:30:23.761755 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6cvx\" (UniqueName: \"kubernetes.io/projected/38a86b29-02d7-4452-ada0-b76ccf29b2e6-kube-api-access-m6cvx\") on node \"crc\" DevicePath \"\"" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.087696 4935 generic.go:334] "Generic (PLEG): container finished" podID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" 
containerID="3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc" exitCode=0 Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.087745 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhlfb" event={"ID":"38a86b29-02d7-4452-ada0-b76ccf29b2e6","Type":"ContainerDied","Data":"3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc"} Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.088187 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fhlfb" event={"ID":"38a86b29-02d7-4452-ada0-b76ccf29b2e6","Type":"ContainerDied","Data":"a062e80f898c6d7157cd4a6dc486b718d66da673ad4c9733d7e57e31e9957000"} Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.088213 4935 scope.go:117] "RemoveContainer" containerID="3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.087803 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fhlfb" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.126393 4935 scope.go:117] "RemoveContainer" containerID="2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.130890 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fhlfb"] Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.142107 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fhlfb"] Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.158032 4935 scope.go:117] "RemoveContainer" containerID="f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.217590 4935 scope.go:117] "RemoveContainer" containerID="3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc" Dec 17 09:30:24 crc kubenswrapper[4935]: E1217 09:30:24.219118 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc\": container with ID starting with 3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc not found: ID does not exist" containerID="3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.219191 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc"} err="failed to get container status \"3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc\": rpc error: code = NotFound desc = could not find container \"3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc\": container with ID starting with 3f6ca0bfcf587593ba37e093430482882a468edf94315fbe47c0864d8d8533bc not 
found: ID does not exist" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.219231 4935 scope.go:117] "RemoveContainer" containerID="2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867" Dec 17 09:30:24 crc kubenswrapper[4935]: E1217 09:30:24.219749 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867\": container with ID starting with 2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867 not found: ID does not exist" containerID="2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.219787 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867"} err="failed to get container status \"2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867\": rpc error: code = NotFound desc = could not find container \"2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867\": container with ID starting with 2681815fdb187f3f702adabbac389747e38e4557bbb044056e21491729221867 not found: ID does not exist" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.219812 4935 scope.go:117] "RemoveContainer" containerID="f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960" Dec 17 09:30:24 crc kubenswrapper[4935]: E1217 09:30:24.220426 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960\": container with ID starting with f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960 not found: ID does not exist" containerID="f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960" Dec 17 09:30:24 crc kubenswrapper[4935]: I1217 09:30:24.220472 4935 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960"} err="failed to get container status \"f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960\": rpc error: code = NotFound desc = could not find container \"f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960\": container with ID starting with f2c9f75bd24ccc712dd4ecf2632d315474ad1fde752a0c34350cff7e7ac0d960 not found: ID does not exist" Dec 17 09:30:25 crc kubenswrapper[4935]: I1217 09:30:25.136676 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" path="/var/lib/kubelet/pods/38a86b29-02d7-4452-ada0-b76ccf29b2e6/volumes" Dec 17 09:30:30 crc kubenswrapper[4935]: I1217 09:30:30.130646 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:30:30 crc kubenswrapper[4935]: I1217 09:30:30.131252 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:30:43 crc kubenswrapper[4935]: I1217 09:30:43.022252 4935 scope.go:117] "RemoveContainer" containerID="bff53f3f3f28cf1756acd22727f439be8690f1448e706dcfcc2e884c82114623" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.519147 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zx6kv"] Dec 17 09:30:50 crc kubenswrapper[4935]: E1217 09:30:50.523347 4935 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="extract-content" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.523491 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="extract-content" Dec 17 09:30:50 crc kubenswrapper[4935]: E1217 09:30:50.523598 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="extract-utilities" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.523679 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="extract-utilities" Dec 17 09:30:50 crc kubenswrapper[4935]: E1217 09:30:50.523781 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="registry-server" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.523862 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="registry-server" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.524178 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="38a86b29-02d7-4452-ada0-b76ccf29b2e6" containerName="registry-server" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.526162 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.533188 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zx6kv"] Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.690578 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-utilities\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.690735 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjmx\" (UniqueName: \"kubernetes.io/projected/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-kube-api-access-9hjmx\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.690766 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-catalog-content\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.792738 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-utilities\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.792868 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9hjmx\" (UniqueName: \"kubernetes.io/projected/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-kube-api-access-9hjmx\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.792897 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-catalog-content\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.793370 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-utilities\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.793567 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-catalog-content\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.818224 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjmx\" (UniqueName: \"kubernetes.io/projected/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-kube-api-access-9hjmx\") pod \"certified-operators-zx6kv\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:50 crc kubenswrapper[4935]: I1217 09:30:50.876079 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:30:51 crc kubenswrapper[4935]: I1217 09:30:51.443528 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zx6kv"] Dec 17 09:30:52 crc kubenswrapper[4935]: I1217 09:30:52.402693 4935 generic.go:334] "Generic (PLEG): container finished" podID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerID="2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293" exitCode=0 Dec 17 09:30:52 crc kubenswrapper[4935]: I1217 09:30:52.402771 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx6kv" event={"ID":"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a","Type":"ContainerDied","Data":"2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293"} Dec 17 09:30:52 crc kubenswrapper[4935]: I1217 09:30:52.403084 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx6kv" event={"ID":"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a","Type":"ContainerStarted","Data":"12cb0a44dcb0ece869933c2eb9fae51888f0bfd24a408a443e2eaaa808c2f523"} Dec 17 09:30:54 crc kubenswrapper[4935]: I1217 09:30:54.437775 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx6kv" event={"ID":"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a","Type":"ContainerStarted","Data":"8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f"} Dec 17 09:30:55 crc kubenswrapper[4935]: I1217 09:30:55.451090 4935 generic.go:334] "Generic (PLEG): container finished" podID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerID="8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f" exitCode=0 Dec 17 09:30:55 crc kubenswrapper[4935]: I1217 09:30:55.451183 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx6kv" 
event={"ID":"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a","Type":"ContainerDied","Data":"8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f"} Dec 17 09:30:56 crc kubenswrapper[4935]: I1217 09:30:56.463979 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx6kv" event={"ID":"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a","Type":"ContainerStarted","Data":"294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76"} Dec 17 09:30:56 crc kubenswrapper[4935]: I1217 09:30:56.496885 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zx6kv" podStartSLOduration=3.009091367 podStartE2EDuration="6.496858599s" podCreationTimestamp="2025-12-17 09:30:50 +0000 UTC" firstStartedPulling="2025-12-17 09:30:52.406119697 +0000 UTC m=+1572.065960470" lastFinishedPulling="2025-12-17 09:30:55.893886939 +0000 UTC m=+1575.553727702" observedRunningTime="2025-12-17 09:30:56.48616865 +0000 UTC m=+1576.146009413" watchObservedRunningTime="2025-12-17 09:30:56.496858599 +0000 UTC m=+1576.156699362" Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.131114 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.131824 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.131867 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.132331 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.132411 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" gracePeriod=600 Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.511408 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" exitCode=0 Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.511486 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9"} Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.511533 4935 scope.go:117] "RemoveContainer" containerID="65dd41ad94dd7bae1b7cbbd3c318eb23617db601045887b6ac1ed745fc1e5001" Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.876698 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.877237 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:31:00 crc kubenswrapper[4935]: I1217 09:31:00.941109 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:31:01 crc kubenswrapper[4935]: E1217 09:31:01.041153 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:31:01 crc kubenswrapper[4935]: I1217 09:31:01.528099 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:31:01 crc kubenswrapper[4935]: E1217 09:31:01.528479 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:31:01 crc kubenswrapper[4935]: I1217 09:31:01.590749 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:31:01 crc kubenswrapper[4935]: I1217 09:31:01.646979 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zx6kv"] Dec 17 09:31:03 crc kubenswrapper[4935]: I1217 09:31:03.546981 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zx6kv" 
podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="registry-server" containerID="cri-o://294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76" gracePeriod=2 Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.020381 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.214510 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-utilities\") pod \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.214662 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hjmx\" (UniqueName: \"kubernetes.io/projected/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-kube-api-access-9hjmx\") pod \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.214770 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-catalog-content\") pod \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\" (UID: \"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a\") " Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.215668 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-utilities" (OuterVolumeSpecName: "utilities") pod "6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" (UID: "6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.224873 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-kube-api-access-9hjmx" (OuterVolumeSpecName: "kube-api-access-9hjmx") pod "6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" (UID: "6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a"). InnerVolumeSpecName "kube-api-access-9hjmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.271698 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" (UID: "6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.320560 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.320627 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hjmx\" (UniqueName: \"kubernetes.io/projected/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-kube-api-access-9hjmx\") on node \"crc\" DevicePath \"\"" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.320646 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.559621 4935 generic.go:334] "Generic (PLEG): container finished" podID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" 
containerID="294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76" exitCode=0 Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.559683 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx6kv" event={"ID":"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a","Type":"ContainerDied","Data":"294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76"} Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.559734 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zx6kv" event={"ID":"6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a","Type":"ContainerDied","Data":"12cb0a44dcb0ece869933c2eb9fae51888f0bfd24a408a443e2eaaa808c2f523"} Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.559761 4935 scope.go:117] "RemoveContainer" containerID="294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.559700 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zx6kv" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.592752 4935 scope.go:117] "RemoveContainer" containerID="8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.608136 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zx6kv"] Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.621374 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zx6kv"] Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.625102 4935 scope.go:117] "RemoveContainer" containerID="2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.685943 4935 scope.go:117] "RemoveContainer" containerID="294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76" Dec 17 09:31:04 crc kubenswrapper[4935]: E1217 09:31:04.686590 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76\": container with ID starting with 294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76 not found: ID does not exist" containerID="294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.686654 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76"} err="failed to get container status \"294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76\": rpc error: code = NotFound desc = could not find container \"294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76\": container with ID starting with 294e83103b457aa6bc6315f0cb89dfd0a9c24f4c6aeba346771bc67607666f76 not 
found: ID does not exist" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.686701 4935 scope.go:117] "RemoveContainer" containerID="8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f" Dec 17 09:31:04 crc kubenswrapper[4935]: E1217 09:31:04.687229 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f\": container with ID starting with 8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f not found: ID does not exist" containerID="8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.687459 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f"} err="failed to get container status \"8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f\": rpc error: code = NotFound desc = could not find container \"8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f\": container with ID starting with 8114b0292130e4a0520394e85c2f44429ecbde59fba7b6b4916ac6dd0b20747f not found: ID does not exist" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.687594 4935 scope.go:117] "RemoveContainer" containerID="2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293" Dec 17 09:31:04 crc kubenswrapper[4935]: E1217 09:31:04.688035 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293\": container with ID starting with 2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293 not found: ID does not exist" containerID="2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293" Dec 17 09:31:04 crc kubenswrapper[4935]: I1217 09:31:04.688059 4935 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293"} err="failed to get container status \"2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293\": rpc error: code = NotFound desc = could not find container \"2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293\": container with ID starting with 2b2aaf5e954aa96393e82ee8ceb207ff113bee866a5a12c74a7e9dcfd1d02293 not found: ID does not exist" Dec 17 09:31:05 crc kubenswrapper[4935]: I1217 09:31:05.142326 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" path="/var/lib/kubelet/pods/6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a/volumes" Dec 17 09:31:13 crc kubenswrapper[4935]: I1217 09:31:13.125684 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:31:13 crc kubenswrapper[4935]: E1217 09:31:13.126578 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:31:27 crc kubenswrapper[4935]: I1217 09:31:27.124726 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:31:27 crc kubenswrapper[4935]: E1217 09:31:27.125934 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:31:33 crc kubenswrapper[4935]: I1217 09:31:33.952929 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n6rzr"] Dec 17 09:31:33 crc kubenswrapper[4935]: E1217 09:31:33.954240 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="extract-utilities" Dec 17 09:31:33 crc kubenswrapper[4935]: I1217 09:31:33.954288 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="extract-utilities" Dec 17 09:31:33 crc kubenswrapper[4935]: E1217 09:31:33.954301 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="extract-content" Dec 17 09:31:33 crc kubenswrapper[4935]: I1217 09:31:33.954313 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="extract-content" Dec 17 09:31:33 crc kubenswrapper[4935]: E1217 09:31:33.954344 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="registry-server" Dec 17 09:31:33 crc kubenswrapper[4935]: I1217 09:31:33.954353 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="registry-server" Dec 17 09:31:33 crc kubenswrapper[4935]: I1217 09:31:33.954604 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b75bbb4-6cda-43fa-b48d-de40ac0b4f4a" containerName="registry-server" Dec 17 09:31:33 crc kubenswrapper[4935]: I1217 09:31:33.956560 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:33 crc kubenswrapper[4935]: I1217 09:31:33.977814 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n6rzr"] Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.069369 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5tvb\" (UniqueName: \"kubernetes.io/projected/ee325b7c-ceac-4a25-af5a-0828173376d5-kube-api-access-h5tvb\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.069445 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-utilities\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.069494 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-catalog-content\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.171424 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-utilities\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.171843 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-catalog-content\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.172017 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5tvb\" (UniqueName: \"kubernetes.io/projected/ee325b7c-ceac-4a25-af5a-0828173376d5-kube-api-access-h5tvb\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.172250 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-utilities\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.172559 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-catalog-content\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.201308 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5tvb\" (UniqueName: \"kubernetes.io/projected/ee325b7c-ceac-4a25-af5a-0828173376d5-kube-api-access-h5tvb\") pod \"redhat-marketplace-n6rzr\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.280225 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.780883 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n6rzr"] Dec 17 09:31:34 crc kubenswrapper[4935]: I1217 09:31:34.868872 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n6rzr" event={"ID":"ee325b7c-ceac-4a25-af5a-0828173376d5","Type":"ContainerStarted","Data":"0ac403460986877339f1e80da50f8f64bb9595cf17209ecf7666c377ce5a3745"} Dec 17 09:31:35 crc kubenswrapper[4935]: I1217 09:31:35.880344 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerID="688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe" exitCode=0 Dec 17 09:31:35 crc kubenswrapper[4935]: I1217 09:31:35.880476 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n6rzr" event={"ID":"ee325b7c-ceac-4a25-af5a-0828173376d5","Type":"ContainerDied","Data":"688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe"} Dec 17 09:31:37 crc kubenswrapper[4935]: I1217 09:31:37.900391 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerID="9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2" exitCode=0 Dec 17 09:31:37 crc kubenswrapper[4935]: I1217 09:31:37.900485 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n6rzr" event={"ID":"ee325b7c-ceac-4a25-af5a-0828173376d5","Type":"ContainerDied","Data":"9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2"} Dec 17 09:31:39 crc kubenswrapper[4935]: I1217 09:31:39.124869 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:31:39 crc kubenswrapper[4935]: E1217 09:31:39.125719 4935 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:31:39 crc kubenswrapper[4935]: I1217 09:31:39.920217 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n6rzr" event={"ID":"ee325b7c-ceac-4a25-af5a-0828173376d5","Type":"ContainerStarted","Data":"46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c"} Dec 17 09:31:39 crc kubenswrapper[4935]: I1217 09:31:39.944054 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n6rzr" podStartSLOduration=4.116511038 podStartE2EDuration="6.944034823s" podCreationTimestamp="2025-12-17 09:31:33 +0000 UTC" firstStartedPulling="2025-12-17 09:31:35.88377633 +0000 UTC m=+1615.543617123" lastFinishedPulling="2025-12-17 09:31:38.711300145 +0000 UTC m=+1618.371140908" observedRunningTime="2025-12-17 09:31:39.939445242 +0000 UTC m=+1619.599286005" watchObservedRunningTime="2025-12-17 09:31:39.944034823 +0000 UTC m=+1619.603875586" Dec 17 09:31:44 crc kubenswrapper[4935]: I1217 09:31:44.280832 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:44 crc kubenswrapper[4935]: I1217 09:31:44.281478 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:44 crc kubenswrapper[4935]: I1217 09:31:44.334473 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:45 crc kubenswrapper[4935]: I1217 09:31:45.013585 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:45 crc kubenswrapper[4935]: I1217 09:31:45.062026 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n6rzr"] Dec 17 09:31:46 crc kubenswrapper[4935]: I1217 09:31:46.977546 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n6rzr" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="registry-server" containerID="cri-o://46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c" gracePeriod=2 Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.426679 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.595521 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5tvb\" (UniqueName: \"kubernetes.io/projected/ee325b7c-ceac-4a25-af5a-0828173376d5-kube-api-access-h5tvb\") pod \"ee325b7c-ceac-4a25-af5a-0828173376d5\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.596090 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-utilities\") pod \"ee325b7c-ceac-4a25-af5a-0828173376d5\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.596150 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-catalog-content\") pod \"ee325b7c-ceac-4a25-af5a-0828173376d5\" (UID: \"ee325b7c-ceac-4a25-af5a-0828173376d5\") " Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.597033 4935 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-utilities" (OuterVolumeSpecName: "utilities") pod "ee325b7c-ceac-4a25-af5a-0828173376d5" (UID: "ee325b7c-ceac-4a25-af5a-0828173376d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.608063 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee325b7c-ceac-4a25-af5a-0828173376d5-kube-api-access-h5tvb" (OuterVolumeSpecName: "kube-api-access-h5tvb") pod "ee325b7c-ceac-4a25-af5a-0828173376d5" (UID: "ee325b7c-ceac-4a25-af5a-0828173376d5"). InnerVolumeSpecName "kube-api-access-h5tvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.618408 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee325b7c-ceac-4a25-af5a-0828173376d5" (UID: "ee325b7c-ceac-4a25-af5a-0828173376d5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.698916 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.698951 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5tvb\" (UniqueName: \"kubernetes.io/projected/ee325b7c-ceac-4a25-af5a-0828173376d5-kube-api-access-h5tvb\") on node \"crc\" DevicePath \"\"" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.698967 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee325b7c-ceac-4a25-af5a-0828173376d5-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.992621 4935 generic.go:334] "Generic (PLEG): container finished" podID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerID="46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c" exitCode=0 Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.992688 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n6rzr" event={"ID":"ee325b7c-ceac-4a25-af5a-0828173376d5","Type":"ContainerDied","Data":"46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c"} Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.992724 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n6rzr" event={"ID":"ee325b7c-ceac-4a25-af5a-0828173376d5","Type":"ContainerDied","Data":"0ac403460986877339f1e80da50f8f64bb9595cf17209ecf7666c377ce5a3745"} Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.992722 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n6rzr" Dec 17 09:31:47 crc kubenswrapper[4935]: I1217 09:31:47.992741 4935 scope.go:117] "RemoveContainer" containerID="46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.016519 4935 scope.go:117] "RemoveContainer" containerID="9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.039845 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n6rzr"] Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.051010 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n6rzr"] Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.062036 4935 scope.go:117] "RemoveContainer" containerID="688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.104192 4935 scope.go:117] "RemoveContainer" containerID="46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c" Dec 17 09:31:48 crc kubenswrapper[4935]: E1217 09:31:48.104969 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c\": container with ID starting with 46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c not found: ID does not exist" containerID="46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.105255 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c"} err="failed to get container status \"46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c\": rpc error: code = NotFound desc = could not find container 
\"46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c\": container with ID starting with 46777aace36b381329422ac90a97ffe5aee16b2213ddb5433e6aa05c08aabb0c not found: ID does not exist" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.105373 4935 scope.go:117] "RemoveContainer" containerID="9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2" Dec 17 09:31:48 crc kubenswrapper[4935]: E1217 09:31:48.105832 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2\": container with ID starting with 9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2 not found: ID does not exist" containerID="9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.105854 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2"} err="failed to get container status \"9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2\": rpc error: code = NotFound desc = could not find container \"9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2\": container with ID starting with 9ad9ca739acf4e20ddaf63304a2ae46371333f4892cddf92b40bf1d433efeef2 not found: ID does not exist" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.105869 4935 scope.go:117] "RemoveContainer" containerID="688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe" Dec 17 09:31:48 crc kubenswrapper[4935]: E1217 09:31:48.106538 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe\": container with ID starting with 688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe not found: ID does not exist" 
containerID="688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe" Dec 17 09:31:48 crc kubenswrapper[4935]: I1217 09:31:48.106584 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe"} err="failed to get container status \"688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe\": rpc error: code = NotFound desc = could not find container \"688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe\": container with ID starting with 688642e24ba1e4b2dbb48bd3e69d68884a6fc5e0c8c0b2ff1e89bfdd100bdcfe not found: ID does not exist" Dec 17 09:31:49 crc kubenswrapper[4935]: I1217 09:31:49.135447 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" path="/var/lib/kubelet/pods/ee325b7c-ceac-4a25-af5a-0828173376d5/volumes" Dec 17 09:31:51 crc kubenswrapper[4935]: I1217 09:31:51.134890 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:31:51 crc kubenswrapper[4935]: E1217 09:31:51.135193 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:32:05 crc kubenswrapper[4935]: I1217 09:32:05.124949 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:32:05 crc kubenswrapper[4935]: E1217 09:32:05.126828 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:32:18 crc kubenswrapper[4935]: I1217 09:32:18.126225 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:32:18 crc kubenswrapper[4935]: E1217 09:32:18.127904 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:32:30 crc kubenswrapper[4935]: I1217 09:32:30.125356 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:32:30 crc kubenswrapper[4935]: E1217 09:32:30.126711 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:32:42 crc kubenswrapper[4935]: I1217 09:32:42.124773 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:32:42 crc kubenswrapper[4935]: E1217 09:32:42.125949 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:32:50 crc kubenswrapper[4935]: I1217 09:32:50.665861 4935 generic.go:334] "Generic (PLEG): container finished" podID="1b3c1c73-3f87-4383-9d09-1931001f0629" containerID="3d19c5e7ea47a0f5b2a81a48e0295d0281e4f4f6320f053dc46e0e51bff41958" exitCode=0 Dec 17 09:32:50 crc kubenswrapper[4935]: I1217 09:32:50.665938 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" event={"ID":"1b3c1c73-3f87-4383-9d09-1931001f0629","Type":"ContainerDied","Data":"3d19c5e7ea47a0f5b2a81a48e0295d0281e4f4f6320f053dc46e0e51bff41958"} Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.080854 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.234240 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gj9m\" (UniqueName: \"kubernetes.io/projected/1b3c1c73-3f87-4383-9d09-1931001f0629-kube-api-access-8gj9m\") pod \"1b3c1c73-3f87-4383-9d09-1931001f0629\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.234373 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-bootstrap-combined-ca-bundle\") pod \"1b3c1c73-3f87-4383-9d09-1931001f0629\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.234533 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-ssh-key\") pod \"1b3c1c73-3f87-4383-9d09-1931001f0629\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.234646 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-inventory\") pod \"1b3c1c73-3f87-4383-9d09-1931001f0629\" (UID: \"1b3c1c73-3f87-4383-9d09-1931001f0629\") " Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.242399 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1b3c1c73-3f87-4383-9d09-1931001f0629" (UID: "1b3c1c73-3f87-4383-9d09-1931001f0629"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.252059 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3c1c73-3f87-4383-9d09-1931001f0629-kube-api-access-8gj9m" (OuterVolumeSpecName: "kube-api-access-8gj9m") pod "1b3c1c73-3f87-4383-9d09-1931001f0629" (UID: "1b3c1c73-3f87-4383-9d09-1931001f0629"). InnerVolumeSpecName "kube-api-access-8gj9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.263973 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-inventory" (OuterVolumeSpecName: "inventory") pod "1b3c1c73-3f87-4383-9d09-1931001f0629" (UID: "1b3c1c73-3f87-4383-9d09-1931001f0629"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.264883 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1b3c1c73-3f87-4383-9d09-1931001f0629" (UID: "1b3c1c73-3f87-4383-9d09-1931001f0629"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.339224 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gj9m\" (UniqueName: \"kubernetes.io/projected/1b3c1c73-3f87-4383-9d09-1931001f0629-kube-api-access-8gj9m\") on node \"crc\" DevicePath \"\"" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.339264 4935 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.339297 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.339309 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1b3c1c73-3f87-4383-9d09-1931001f0629-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.695336 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" event={"ID":"1b3c1c73-3f87-4383-9d09-1931001f0629","Type":"ContainerDied","Data":"a56b13cfd4ded465f995e7ba9570b6dad17d37629de903c7ae23838148397512"} Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.695386 4935 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="a56b13cfd4ded465f995e7ba9570b6dad17d37629de903c7ae23838148397512" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.695444 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.791659 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6"] Dec 17 09:32:52 crc kubenswrapper[4935]: E1217 09:32:52.792412 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="extract-content" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.792443 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="extract-content" Dec 17 09:32:52 crc kubenswrapper[4935]: E1217 09:32:52.792473 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="registry-server" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.792483 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="registry-server" Dec 17 09:32:52 crc kubenswrapper[4935]: E1217 09:32:52.792518 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="extract-utilities" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.792530 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="extract-utilities" Dec 17 09:32:52 crc kubenswrapper[4935]: E1217 09:32:52.792548 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b3c1c73-3f87-4383-9d09-1931001f0629" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 
09:32:52.792562 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3c1c73-3f87-4383-9d09-1931001f0629" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.792771 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee325b7c-ceac-4a25-af5a-0828173376d5" containerName="registry-server" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.792797 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b3c1c73-3f87-4383-9d09-1931001f0629" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.793721 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.796365 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.796877 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.797045 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.798014 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.803260 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6"] Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.861078 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkmwl\" (UniqueName: 
\"kubernetes.io/projected/9a7d6590-bf03-479a-a094-259dd4efafef-kube-api-access-xkmwl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.861443 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.861703 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.964532 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.964720 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.965061 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkmwl\" (UniqueName: \"kubernetes.io/projected/9a7d6590-bf03-479a-a094-259dd4efafef-kube-api-access-xkmwl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.972085 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.985834 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:52 crc kubenswrapper[4935]: I1217 09:32:52.990631 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkmwl\" (UniqueName: \"kubernetes.io/projected/9a7d6590-bf03-479a-a094-259dd4efafef-kube-api-access-xkmwl\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8drs6\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:53 crc kubenswrapper[4935]: I1217 09:32:53.163448 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:32:53 crc kubenswrapper[4935]: I1217 09:32:53.733062 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6"] Dec 17 09:32:53 crc kubenswrapper[4935]: I1217 09:32:53.736829 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 09:32:54 crc kubenswrapper[4935]: I1217 09:32:54.714916 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" event={"ID":"9a7d6590-bf03-479a-a094-259dd4efafef","Type":"ContainerStarted","Data":"2d95ad1af7a97019b9312bae20582daa8d2d369a1b3662b25b9d5c802421dd81"} Dec 17 09:32:55 crc kubenswrapper[4935]: I1217 09:32:55.730916 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" event={"ID":"9a7d6590-bf03-479a-a094-259dd4efafef","Type":"ContainerStarted","Data":"46ce482a0ce2037f67b26f82c3349ed3e737c29f6ef16b263d31be5254fb8905"} Dec 17 09:32:55 crc kubenswrapper[4935]: I1217 09:32:55.751679 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" podStartSLOduration=2.033465581 podStartE2EDuration="3.751657968s" podCreationTimestamp="2025-12-17 09:32:52 +0000 UTC" firstStartedPulling="2025-12-17 09:32:53.736612756 +0000 UTC m=+1693.396453519" lastFinishedPulling="2025-12-17 09:32:55.454805143 +0000 UTC m=+1695.114645906" observedRunningTime="2025-12-17 09:32:55.746785632 +0000 UTC m=+1695.406626415" watchObservedRunningTime="2025-12-17 09:32:55.751657968 +0000 UTC m=+1695.411498731" Dec 17 09:32:56 crc kubenswrapper[4935]: I1217 09:32:56.124769 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:32:56 crc 
kubenswrapper[4935]: E1217 09:32:56.125403 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:33:09 crc kubenswrapper[4935]: I1217 09:33:09.124488 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:33:09 crc kubenswrapper[4935]: E1217 09:33:09.125346 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:33:20 crc kubenswrapper[4935]: I1217 09:33:20.047479 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-5d06-account-create-update-wwsz6"] Dec 17 09:33:20 crc kubenswrapper[4935]: I1217 09:33:20.061228 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-2dr9n"] Dec 17 09:33:20 crc kubenswrapper[4935]: I1217 09:33:20.070989 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-2dr9n"] Dec 17 09:33:20 crc kubenswrapper[4935]: I1217 09:33:20.080880 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-5d06-account-create-update-wwsz6"] Dec 17 09:33:21 crc kubenswrapper[4935]: I1217 09:33:21.136247 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd983b91-5108-4717-a2d2-4324cbd041eb" 
path="/var/lib/kubelet/pods/cd983b91-5108-4717-a2d2-4324cbd041eb/volumes" Dec 17 09:33:21 crc kubenswrapper[4935]: I1217 09:33:21.137194 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9d6a1ba-bf06-4798-87c6-8980f387fe14" path="/var/lib/kubelet/pods/e9d6a1ba-bf06-4798-87c6-8980f387fe14/volumes" Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.034166 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-9dfe-account-create-update-wk5hj"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.044642 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-1418-account-create-update-lwlzp"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.057732 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-9dfe-account-create-update-wk5hj"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.070292 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ggjn4"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.081440 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-1418-account-create-update-lwlzp"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.091401 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-7k42n"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.098660 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-7k42n"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.105892 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ggjn4"] Dec 17 09:33:24 crc kubenswrapper[4935]: I1217 09:33:24.124604 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:33:24 crc kubenswrapper[4935]: E1217 09:33:24.124885 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:33:25 crc kubenswrapper[4935]: I1217 09:33:25.143087 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485a1677-2580-4128-8711-74c1136c0716" path="/var/lib/kubelet/pods/485a1677-2580-4128-8711-74c1136c0716/volumes" Dec 17 09:33:25 crc kubenswrapper[4935]: I1217 09:33:25.144040 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71ed76ae-907f-4693-a696-1d43ee6fb5e2" path="/var/lib/kubelet/pods/71ed76ae-907f-4693-a696-1d43ee6fb5e2/volumes" Dec 17 09:33:25 crc kubenswrapper[4935]: I1217 09:33:25.144868 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75555da0-8b97-47fa-8851-3adb9fa308ec" path="/var/lib/kubelet/pods/75555da0-8b97-47fa-8851-3adb9fa308ec/volumes" Dec 17 09:33:25 crc kubenswrapper[4935]: I1217 09:33:25.145642 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98ee9cd1-7c28-49c0-89f6-d1f5e70ce023" path="/var/lib/kubelet/pods/98ee9cd1-7c28-49c0-89f6-d1f5e70ce023/volumes" Dec 17 09:33:38 crc kubenswrapper[4935]: I1217 09:33:38.125056 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:33:38 crc kubenswrapper[4935]: E1217 09:33:38.126745 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" 
podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:33:42 crc kubenswrapper[4935]: I1217 09:33:42.047225 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-z2qrd"] Dec 17 09:33:42 crc kubenswrapper[4935]: I1217 09:33:42.060438 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-z2qrd"] Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.045926 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-kxwws"] Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.059096 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-kxwws"] Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.137456 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="454cfd39-9bc9-4e7e-968b-7f4b67654fb0" path="/var/lib/kubelet/pods/454cfd39-9bc9-4e7e-968b-7f4b67654fb0/volumes" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.138134 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd84fa06-5321-450f-b602-7f09d571a6d6" path="/var/lib/kubelet/pods/cd84fa06-5321-450f-b602-7f09d571a6d6/volumes" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.212700 4935 scope.go:117] "RemoveContainer" containerID="4a349b57feb6c668a7410d3d6c44992ef03228947d63e42323ca94794ebf3dbf" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.245948 4935 scope.go:117] "RemoveContainer" containerID="66aa8232c6e694cf0fc325dbd7b5842044aaccb7720f9cf9a1d4c5fc5e172360" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.307069 4935 scope.go:117] "RemoveContainer" containerID="ffbc25a3cbb476037b3e5fb9136fc2ba3f2a832e84ed3f1f0228045f7688ef1f" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.334035 4935 scope.go:117] "RemoveContainer" containerID="36bab1ab0080ed8bdef02c4f3b3b63f52844437928e73de8803b2ba6ae52d29b" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.378567 4935 scope.go:117] "RemoveContainer" 
containerID="756d7751af282a1839230ccb7967e446b7b0e2dbeeee06d542db49825558de89" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.406617 4935 scope.go:117] "RemoveContainer" containerID="2e581075e1d1ff9f2f248f8ca217cfc19fc71618b2bb5c97e986069300ff193a" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.449916 4935 scope.go:117] "RemoveContainer" containerID="0f2fcc7c08fda2f08c8fbd662c24b964a943604b883d8d6e1d9385fde2d16777" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.478495 4935 scope.go:117] "RemoveContainer" containerID="b3c6a77bd41056590ad41afbad2be22148fc25ed8311caf463e52a2357372be8" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.527376 4935 scope.go:117] "RemoveContainer" containerID="992f1cb705bceb8a16a4ad9543ea505a37b782e1c7c3919c4f0c33ee631cfcd4" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.582550 4935 scope.go:117] "RemoveContainer" containerID="2c9962e9c53140e99e884b357832def79baa361e42273bd0224100d32e8f45ec" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.605475 4935 scope.go:117] "RemoveContainer" containerID="094c87efb5a078eb16f0e7e48a0388b82596e884343dba5137e653e3f7010fa4" Dec 17 09:33:43 crc kubenswrapper[4935]: I1217 09:33:43.644250 4935 scope.go:117] "RemoveContainer" containerID="6996a6849f33defe574dffd691d706db6665a93e36b385cf206811ab1230d102" Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.052261 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6ff9-account-create-update-cqjmd"] Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.063592 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-4rx9f"] Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.075200 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-fcac-account-create-update-xwfk2"] Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.085682 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/barbican-a108-account-create-update-5gp8l"] Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.093923 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-4rx9f"] Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.101262 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6ff9-account-create-update-cqjmd"] Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.109823 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-fcac-account-create-update-xwfk2"] Dec 17 09:33:44 crc kubenswrapper[4935]: I1217 09:33:44.118931 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-a108-account-create-update-5gp8l"] Dec 17 09:33:45 crc kubenswrapper[4935]: I1217 09:33:45.140642 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1" path="/var/lib/kubelet/pods/7d24a9e3-6ddf-4cfa-9a10-dfed9bec9fa1/volumes" Dec 17 09:33:45 crc kubenswrapper[4935]: I1217 09:33:45.141534 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb24d5e-88f0-4613-9cad-16f2f3717cae" path="/var/lib/kubelet/pods/9cb24d5e-88f0-4613-9cad-16f2f3717cae/volumes" Dec 17 09:33:45 crc kubenswrapper[4935]: I1217 09:33:45.142330 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5c13b61-d034-4dfe-9632-f11b4210eba1" path="/var/lib/kubelet/pods/c5c13b61-d034-4dfe-9632-f11b4210eba1/volumes" Dec 17 09:33:45 crc kubenswrapper[4935]: I1217 09:33:45.143144 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6df98a0-3831-48eb-8a28-648be6ec3b08" path="/var/lib/kubelet/pods/f6df98a0-3831-48eb-8a28-648be6ec3b08/volumes" Dec 17 09:33:53 crc kubenswrapper[4935]: I1217 09:33:53.124098 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:33:53 crc kubenswrapper[4935]: E1217 09:33:53.124933 4935 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:33:55 crc kubenswrapper[4935]: I1217 09:33:55.031898 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-bmvch"] Dec 17 09:33:55 crc kubenswrapper[4935]: I1217 09:33:55.040503 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-t4kgb"] Dec 17 09:33:55 crc kubenswrapper[4935]: I1217 09:33:55.049794 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-bmvch"] Dec 17 09:33:55 crc kubenswrapper[4935]: I1217 09:33:55.058538 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-t4kgb"] Dec 17 09:33:55 crc kubenswrapper[4935]: I1217 09:33:55.136440 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07" path="/var/lib/kubelet/pods/c3a0d4d5-7bf0-45c8-85bb-67fb1c465a07/volumes" Dec 17 09:33:55 crc kubenswrapper[4935]: I1217 09:33:55.137254 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4bd9ed6-70bb-4e14-90f1-4cb7488daf41" path="/var/lib/kubelet/pods/d4bd9ed6-70bb-4e14-90f1-4cb7488daf41/volumes" Dec 17 09:34:07 crc kubenswrapper[4935]: I1217 09:34:07.124912 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:34:07 crc kubenswrapper[4935]: E1217 09:34:07.125883 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:34:20 crc kubenswrapper[4935]: I1217 09:34:20.124117 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:34:20 crc kubenswrapper[4935]: E1217 09:34:20.125088 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:34:32 crc kubenswrapper[4935]: I1217 09:34:32.124975 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:34:32 crc kubenswrapper[4935]: E1217 09:34:32.125909 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:34:36 crc kubenswrapper[4935]: I1217 09:34:36.048986 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-g8ctb"] Dec 17 09:34:36 crc kubenswrapper[4935]: I1217 09:34:36.059030 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-g8ctb"] Dec 17 09:34:36 crc kubenswrapper[4935]: I1217 09:34:36.832305 4935 generic.go:334] 
"Generic (PLEG): container finished" podID="9a7d6590-bf03-479a-a094-259dd4efafef" containerID="46ce482a0ce2037f67b26f82c3349ed3e737c29f6ef16b263d31be5254fb8905" exitCode=0 Dec 17 09:34:36 crc kubenswrapper[4935]: I1217 09:34:36.832389 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" event={"ID":"9a7d6590-bf03-479a-a094-259dd4efafef","Type":"ContainerDied","Data":"46ce482a0ce2037f67b26f82c3349ed3e737c29f6ef16b263d31be5254fb8905"} Dec 17 09:34:37 crc kubenswrapper[4935]: I1217 09:34:37.135646 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e284fa-5f85-409e-bcb3-fcb2b320a0fe" path="/var/lib/kubelet/pods/99e284fa-5f85-409e-bcb3-fcb2b320a0fe/volumes" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.332835 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.493366 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkmwl\" (UniqueName: \"kubernetes.io/projected/9a7d6590-bf03-479a-a094-259dd4efafef-kube-api-access-xkmwl\") pod \"9a7d6590-bf03-479a-a094-259dd4efafef\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.493545 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-inventory\") pod \"9a7d6590-bf03-479a-a094-259dd4efafef\" (UID: \"9a7d6590-bf03-479a-a094-259dd4efafef\") " Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.493645 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-ssh-key\") pod \"9a7d6590-bf03-479a-a094-259dd4efafef\" (UID: 
\"9a7d6590-bf03-479a-a094-259dd4efafef\") " Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.527482 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a7d6590-bf03-479a-a094-259dd4efafef-kube-api-access-xkmwl" (OuterVolumeSpecName: "kube-api-access-xkmwl") pod "9a7d6590-bf03-479a-a094-259dd4efafef" (UID: "9a7d6590-bf03-479a-a094-259dd4efafef"). InnerVolumeSpecName "kube-api-access-xkmwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.532216 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-inventory" (OuterVolumeSpecName: "inventory") pod "9a7d6590-bf03-479a-a094-259dd4efafef" (UID: "9a7d6590-bf03-479a-a094-259dd4efafef"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.534501 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9a7d6590-bf03-479a-a094-259dd4efafef" (UID: "9a7d6590-bf03-479a-a094-259dd4efafef"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.596631 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.596717 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9a7d6590-bf03-479a-a094-259dd4efafef-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.596735 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkmwl\" (UniqueName: \"kubernetes.io/projected/9a7d6590-bf03-479a-a094-259dd4efafef-kube-api-access-xkmwl\") on node \"crc\" DevicePath \"\"" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.855824 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" event={"ID":"9a7d6590-bf03-479a-a094-259dd4efafef","Type":"ContainerDied","Data":"2d95ad1af7a97019b9312bae20582daa8d2d369a1b3662b25b9d5c802421dd81"} Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.855883 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d95ad1af7a97019b9312bae20582daa8d2d369a1b3662b25b9d5c802421dd81" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.856480 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8drs6" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.972622 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2"] Dec 17 09:34:38 crc kubenswrapper[4935]: E1217 09:34:38.973349 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a7d6590-bf03-479a-a094-259dd4efafef" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.973377 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a7d6590-bf03-479a-a094-259dd4efafef" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.973733 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a7d6590-bf03-479a-a094-259dd4efafef" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.974896 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.978040 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.978322 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.979318 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.979434 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:34:38 crc kubenswrapper[4935]: I1217 09:34:38.985617 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2"] Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.005415 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fh5r\" (UniqueName: \"kubernetes.io/projected/a4733665-b253-4afc-b8a3-3028f3fb2892-kube-api-access-7fh5r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.005499 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 
09:34:39.005605 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.106939 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.107067 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fh5r\" (UniqueName: \"kubernetes.io/projected/a4733665-b253-4afc-b8a3-3028f3fb2892-kube-api-access-7fh5r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.107313 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.113824 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.113950 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.134170 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fh5r\" (UniqueName: \"kubernetes.io/projected/a4733665-b253-4afc-b8a3-3028f3fb2892-kube-api-access-7fh5r\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-48fh2\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.293041 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:34:39 crc kubenswrapper[4935]: I1217 09:34:39.881916 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2"] Dec 17 09:34:40 crc kubenswrapper[4935]: I1217 09:34:40.879303 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" event={"ID":"a4733665-b253-4afc-b8a3-3028f3fb2892","Type":"ContainerStarted","Data":"80bede412335ecb2f74712c43c612f5951c21efdcd945267d64e7a860f0ce2cc"} Dec 17 09:34:41 crc kubenswrapper[4935]: I1217 09:34:41.889166 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" event={"ID":"a4733665-b253-4afc-b8a3-3028f3fb2892","Type":"ContainerStarted","Data":"6d0afff619b079ca621dbe22c18d3bfbd4cdc5a3756d8f9cfb1399d3d313120b"} Dec 17 09:34:41 crc kubenswrapper[4935]: I1217 09:34:41.908310 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" podStartSLOduration=2.97822933 podStartE2EDuration="3.908267712s" podCreationTimestamp="2025-12-17 09:34:38 +0000 UTC" firstStartedPulling="2025-12-17 09:34:39.881637624 +0000 UTC m=+1799.541478387" lastFinishedPulling="2025-12-17 09:34:40.811676006 +0000 UTC m=+1800.471516769" observedRunningTime="2025-12-17 09:34:41.907217807 +0000 UTC m=+1801.567058580" watchObservedRunningTime="2025-12-17 09:34:41.908267712 +0000 UTC m=+1801.568108475" Dec 17 09:34:43 crc kubenswrapper[4935]: I1217 09:34:43.855638 4935 scope.go:117] "RemoveContainer" containerID="db1f7c1e0b5bcee64b60dccb81d4b73c80fe92ee9a37075949034bb6d429ce1b" Dec 17 09:34:43 crc kubenswrapper[4935]: I1217 09:34:43.878661 4935 scope.go:117] "RemoveContainer" 
containerID="5d51f5eb1cd3f3515acbdb9ece5eb9c266c47c43dcda7661a8f14162908c2979" Dec 17 09:34:43 crc kubenswrapper[4935]: I1217 09:34:43.927536 4935 scope.go:117] "RemoveContainer" containerID="fef2d06d27c63d666220b1ee7bf5e444341ac9e38fdb62d31d0790e181779c06" Dec 17 09:34:44 crc kubenswrapper[4935]: I1217 09:34:44.010202 4935 scope.go:117] "RemoveContainer" containerID="89b2b722e268666b221053c8920c32f79cc1f3d4cf582f40887173888baf4949" Dec 17 09:34:44 crc kubenswrapper[4935]: I1217 09:34:44.032255 4935 scope.go:117] "RemoveContainer" containerID="1e8eddfda05e83a1c6823a74e4e06b33e6d2ee020da47df9a5df12ef153c46e8" Dec 17 09:34:44 crc kubenswrapper[4935]: I1217 09:34:44.078084 4935 scope.go:117] "RemoveContainer" containerID="736c36d5b0a3d1dfb37532adde6f6283ab1b4efaee9daf103245b47e311c24b6" Dec 17 09:34:44 crc kubenswrapper[4935]: I1217 09:34:44.134950 4935 scope.go:117] "RemoveContainer" containerID="5c0545040d4c0d7a0defca6036355ee3fc839a7a9ac8518f0c60bdb0297de245" Dec 17 09:34:46 crc kubenswrapper[4935]: I1217 09:34:46.037876 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-rvtvc"] Dec 17 09:34:46 crc kubenswrapper[4935]: I1217 09:34:46.051504 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-rvtvc"] Dec 17 09:34:46 crc kubenswrapper[4935]: I1217 09:34:46.062953 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xk2l5"] Dec 17 09:34:46 crc kubenswrapper[4935]: I1217 09:34:46.072035 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xk2l5"] Dec 17 09:34:46 crc kubenswrapper[4935]: I1217 09:34:46.125083 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:34:46 crc kubenswrapper[4935]: E1217 09:34:46.125462 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:34:47 crc kubenswrapper[4935]: I1217 09:34:47.135593 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b52e650-9c70-4617-9fbb-12fbb5a1c3e0" path="/var/lib/kubelet/pods/0b52e650-9c70-4617-9fbb-12fbb5a1c3e0/volumes" Dec 17 09:34:47 crc kubenswrapper[4935]: I1217 09:34:47.136785 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b867f99b-0bea-4d24-88e7-4dc1c1f991e6" path="/var/lib/kubelet/pods/b867f99b-0bea-4d24-88e7-4dc1c1f991e6/volumes" Dec 17 09:34:59 crc kubenswrapper[4935]: I1217 09:34:59.033862 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-vhjl6"] Dec 17 09:34:59 crc kubenswrapper[4935]: I1217 09:34:59.045966 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-vhjl6"] Dec 17 09:34:59 crc kubenswrapper[4935]: I1217 09:34:59.136034 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62d0f30-735b-410e-ac80-50a98636ff47" path="/var/lib/kubelet/pods/a62d0f30-735b-410e-ac80-50a98636ff47/volumes" Dec 17 09:35:00 crc kubenswrapper[4935]: I1217 09:35:00.124188 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:35:00 crc kubenswrapper[4935]: E1217 09:35:00.124690 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" 
podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:35:07 crc kubenswrapper[4935]: I1217 09:35:07.037134 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-f7nmq"] Dec 17 09:35:07 crc kubenswrapper[4935]: I1217 09:35:07.048010 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-f7nmq"] Dec 17 09:35:07 crc kubenswrapper[4935]: I1217 09:35:07.141894 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b17c8be-6039-4aa6-8227-cd2dfc076f77" path="/var/lib/kubelet/pods/9b17c8be-6039-4aa6-8227-cd2dfc076f77/volumes" Dec 17 09:35:13 crc kubenswrapper[4935]: I1217 09:35:13.124925 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:35:13 crc kubenswrapper[4935]: E1217 09:35:13.125740 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:35:28 crc kubenswrapper[4935]: I1217 09:35:28.124671 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:35:28 crc kubenswrapper[4935]: E1217 09:35:28.126405 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:35:41 crc kubenswrapper[4935]: I1217 
09:35:41.040753 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8905-account-create-update-d7k5w"] Dec 17 09:35:41 crc kubenswrapper[4935]: I1217 09:35:41.048879 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8905-account-create-update-d7k5w"] Dec 17 09:35:41 crc kubenswrapper[4935]: I1217 09:35:41.135533 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3" path="/var/lib/kubelet/pods/7ee5cb9e-8cb2-4b3c-b71f-69cf98c7cdd3/volumes" Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.064212 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-k26rm"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.078039 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-36ba-account-create-update-vkfrb"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.087966 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-fc5c-account-create-update-27xgk"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.096741 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-k26rm"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.105531 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-36ba-account-create-update-vkfrb"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.113431 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tf769"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.120715 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-fc5c-account-create-update-27xgk"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.124227 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:35:42 crc kubenswrapper[4935]: E1217 
09:35:42.124827 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.128047 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tf769"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.136502 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ttcj9"] Dec 17 09:35:42 crc kubenswrapper[4935]: I1217 09:35:42.146320 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ttcj9"] Dec 17 09:35:43 crc kubenswrapper[4935]: I1217 09:35:43.140300 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abecb40-40b9-4579-949a-40de63a9f65c" path="/var/lib/kubelet/pods/1abecb40-40b9-4579-949a-40de63a9f65c/volumes" Dec 17 09:35:43 crc kubenswrapper[4935]: I1217 09:35:43.140927 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47357892-adb6-48a7-99e9-b464b90d60db" path="/var/lib/kubelet/pods/47357892-adb6-48a7-99e9-b464b90d60db/volumes" Dec 17 09:35:43 crc kubenswrapper[4935]: I1217 09:35:43.141621 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="987fbded-c214-4ec4-8a67-3c79ded79782" path="/var/lib/kubelet/pods/987fbded-c214-4ec4-8a67-3c79ded79782/volumes" Dec 17 09:35:43 crc kubenswrapper[4935]: I1217 09:35:43.142213 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a20517-d9a8-4082-bd71-51790e3e4e89" path="/var/lib/kubelet/pods/a3a20517-d9a8-4082-bd71-51790e3e4e89/volumes" Dec 17 09:35:43 crc kubenswrapper[4935]: I1217 09:35:43.143450 4935 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4e49d6-e019-40a7-873b-89956a1bf3c9" path="/var/lib/kubelet/pods/be4e49d6-e019-40a7-873b-89956a1bf3c9/volumes" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.310030 4935 scope.go:117] "RemoveContainer" containerID="42c8ce9d58026ac8daf60247c2367152c2c018b6ea58efe28a489e158d919d97" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.366133 4935 scope.go:117] "RemoveContainer" containerID="df23db94be5d90dbc57efa70742d40f9309536de3051e7e27800233e6f978215" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.409945 4935 scope.go:117] "RemoveContainer" containerID="dd4dfe6f58e97542bc6a1d0c20e379cc5ce81d4d175be957003516b8b29d3c17" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.448413 4935 scope.go:117] "RemoveContainer" containerID="9f29fe5d93c405192589ace7b8e38b12b14a17c2766dff63747094e4a9a117ce" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.518902 4935 scope.go:117] "RemoveContainer" containerID="99d2e1e9d43e2b0a5452b48ccbdb667d894a00115adaa79add2f82ae5064307e" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.554218 4935 scope.go:117] "RemoveContainer" containerID="429cba53824dc60789d46872c8f1501dab831ba522d949c4155b2d8dfb4293dc" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.586522 4935 scope.go:117] "RemoveContainer" containerID="3ea148c6b6a672fe1cdb81351986afd65cd19564e042063d59a6a01552c2d554" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.615649 4935 scope.go:117] "RemoveContainer" containerID="3b465b3f36b90a6aabc5fa090669a5cd9addbc15694b0ee9bb50dbc35ae9cd5d" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.637809 4935 scope.go:117] "RemoveContainer" containerID="baaf176e20cd084e202c11bb76c0a427e4171584299baafeb86a1f1bc4570c23" Dec 17 09:35:44 crc kubenswrapper[4935]: I1217 09:35:44.658949 4935 scope.go:117] "RemoveContainer" containerID="6595c21fc067628d82349f91738809e6a18571ba8dd3b2825dc2e2abff0cadf6" Dec 17 09:35:52 crc 
kubenswrapper[4935]: I1217 09:35:52.551322 4935 generic.go:334] "Generic (PLEG): container finished" podID="a4733665-b253-4afc-b8a3-3028f3fb2892" containerID="6d0afff619b079ca621dbe22c18d3bfbd4cdc5a3756d8f9cfb1399d3d313120b" exitCode=0 Dec 17 09:35:52 crc kubenswrapper[4935]: I1217 09:35:52.551445 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" event={"ID":"a4733665-b253-4afc-b8a3-3028f3fb2892","Type":"ContainerDied","Data":"6d0afff619b079ca621dbe22c18d3bfbd4cdc5a3756d8f9cfb1399d3d313120b"} Dec 17 09:35:53 crc kubenswrapper[4935]: I1217 09:35:53.973288 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.161965 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-ssh-key\") pod \"a4733665-b253-4afc-b8a3-3028f3fb2892\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.162038 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-inventory\") pod \"a4733665-b253-4afc-b8a3-3028f3fb2892\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.162220 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fh5r\" (UniqueName: \"kubernetes.io/projected/a4733665-b253-4afc-b8a3-3028f3fb2892-kube-api-access-7fh5r\") pod \"a4733665-b253-4afc-b8a3-3028f3fb2892\" (UID: \"a4733665-b253-4afc-b8a3-3028f3fb2892\") " Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.168116 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a4733665-b253-4afc-b8a3-3028f3fb2892-kube-api-access-7fh5r" (OuterVolumeSpecName: "kube-api-access-7fh5r") pod "a4733665-b253-4afc-b8a3-3028f3fb2892" (UID: "a4733665-b253-4afc-b8a3-3028f3fb2892"). InnerVolumeSpecName "kube-api-access-7fh5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.191687 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-inventory" (OuterVolumeSpecName: "inventory") pod "a4733665-b253-4afc-b8a3-3028f3fb2892" (UID: "a4733665-b253-4afc-b8a3-3028f3fb2892"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.193315 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a4733665-b253-4afc-b8a3-3028f3fb2892" (UID: "a4733665-b253-4afc-b8a3-3028f3fb2892"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.266181 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fh5r\" (UniqueName: \"kubernetes.io/projected/a4733665-b253-4afc-b8a3-3028f3fb2892-kube-api-access-7fh5r\") on node \"crc\" DevicePath \"\"" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.266255 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.266298 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a4733665-b253-4afc-b8a3-3028f3fb2892-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.572791 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" event={"ID":"a4733665-b253-4afc-b8a3-3028f3fb2892","Type":"ContainerDied","Data":"80bede412335ecb2f74712c43c612f5951c21efdcd945267d64e7a860f0ce2cc"} Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.572839 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80bede412335ecb2f74712c43c612f5951c21efdcd945267d64e7a860f0ce2cc" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.572848 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-48fh2" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.683651 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt"] Dec 17 09:35:54 crc kubenswrapper[4935]: E1217 09:35:54.684106 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4733665-b253-4afc-b8a3-3028f3fb2892" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.684128 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4733665-b253-4afc-b8a3-3028f3fb2892" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.684469 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4733665-b253-4afc-b8a3-3028f3fb2892" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.685446 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.689196 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.689435 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.689544 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.692186 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.697294 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt"] Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.878379 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmqtb\" (UniqueName: \"kubernetes.io/projected/2fabe8c0-2434-481a-9609-03c9ba3c30d5-kube-api-access-hmqtb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.878854 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 
09:35:54.878986 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.980875 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.980933 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.980990 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmqtb\" (UniqueName: \"kubernetes.io/projected/2fabe8c0-2434-481a-9609-03c9ba3c30d5-kube-api-access-hmqtb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.985779 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:54 crc kubenswrapper[4935]: I1217 09:35:54.985918 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:55 crc kubenswrapper[4935]: I1217 09:35:55.001779 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmqtb\" (UniqueName: \"kubernetes.io/projected/2fabe8c0-2434-481a-9609-03c9ba3c30d5-kube-api-access-hmqtb\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:55 crc kubenswrapper[4935]: I1217 09:35:55.019635 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:35:55 crc kubenswrapper[4935]: I1217 09:35:55.529719 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt"] Dec 17 09:35:55 crc kubenswrapper[4935]: W1217 09:35:55.542219 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fabe8c0_2434_481a_9609_03c9ba3c30d5.slice/crio-9c61f7edf8c45c69aa4140858a5ba2afadd9adc78b01b3c83f8ab27a28ada8b5 WatchSource:0}: Error finding container 9c61f7edf8c45c69aa4140858a5ba2afadd9adc78b01b3c83f8ab27a28ada8b5: Status 404 returned error can't find the container with id 9c61f7edf8c45c69aa4140858a5ba2afadd9adc78b01b3c83f8ab27a28ada8b5 Dec 17 09:35:55 crc kubenswrapper[4935]: I1217 09:35:55.586482 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" event={"ID":"2fabe8c0-2434-481a-9609-03c9ba3c30d5","Type":"ContainerStarted","Data":"9c61f7edf8c45c69aa4140858a5ba2afadd9adc78b01b3c83f8ab27a28ada8b5"} Dec 17 09:35:56 crc kubenswrapper[4935]: I1217 09:35:56.601671 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" event={"ID":"2fabe8c0-2434-481a-9609-03c9ba3c30d5","Type":"ContainerStarted","Data":"8dc9ea9169bbb27b4cbdf9395e99ba81738bd6d1beb568fc3a4510cddbf5a1df"} Dec 17 09:35:56 crc kubenswrapper[4935]: I1217 09:35:56.630381 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" podStartSLOduration=2.068464003 podStartE2EDuration="2.630338295s" podCreationTimestamp="2025-12-17 09:35:54 +0000 UTC" firstStartedPulling="2025-12-17 09:35:55.548600837 +0000 UTC m=+1875.208441600" lastFinishedPulling="2025-12-17 09:35:56.110475129 +0000 UTC 
m=+1875.770315892" observedRunningTime="2025-12-17 09:35:56.621646645 +0000 UTC m=+1876.281487418" watchObservedRunningTime="2025-12-17 09:35:56.630338295 +0000 UTC m=+1876.290179068" Dec 17 09:35:57 crc kubenswrapper[4935]: I1217 09:35:57.124881 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:35:57 crc kubenswrapper[4935]: E1217 09:35:57.125583 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:36:01 crc kubenswrapper[4935]: I1217 09:36:01.646477 4935 generic.go:334] "Generic (PLEG): container finished" podID="2fabe8c0-2434-481a-9609-03c9ba3c30d5" containerID="8dc9ea9169bbb27b4cbdf9395e99ba81738bd6d1beb568fc3a4510cddbf5a1df" exitCode=0 Dec 17 09:36:01 crc kubenswrapper[4935]: I1217 09:36:01.646549 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" event={"ID":"2fabe8c0-2434-481a-9609-03c9ba3c30d5","Type":"ContainerDied","Data":"8dc9ea9169bbb27b4cbdf9395e99ba81738bd6d1beb568fc3a4510cddbf5a1df"} Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.054194 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.154319 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmqtb\" (UniqueName: \"kubernetes.io/projected/2fabe8c0-2434-481a-9609-03c9ba3c30d5-kube-api-access-hmqtb\") pod \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.154378 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-ssh-key\") pod \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.154410 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-inventory\") pod \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\" (UID: \"2fabe8c0-2434-481a-9609-03c9ba3c30d5\") " Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.162106 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fabe8c0-2434-481a-9609-03c9ba3c30d5-kube-api-access-hmqtb" (OuterVolumeSpecName: "kube-api-access-hmqtb") pod "2fabe8c0-2434-481a-9609-03c9ba3c30d5" (UID: "2fabe8c0-2434-481a-9609-03c9ba3c30d5"). InnerVolumeSpecName "kube-api-access-hmqtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.183777 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-inventory" (OuterVolumeSpecName: "inventory") pod "2fabe8c0-2434-481a-9609-03c9ba3c30d5" (UID: "2fabe8c0-2434-481a-9609-03c9ba3c30d5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.192413 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2fabe8c0-2434-481a-9609-03c9ba3c30d5" (UID: "2fabe8c0-2434-481a-9609-03c9ba3c30d5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.258795 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmqtb\" (UniqueName: \"kubernetes.io/projected/2fabe8c0-2434-481a-9609-03c9ba3c30d5-kube-api-access-hmqtb\") on node \"crc\" DevicePath \"\"" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.258992 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.259003 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2fabe8c0-2434-481a-9609-03c9ba3c30d5-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.668443 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" event={"ID":"2fabe8c0-2434-481a-9609-03c9ba3c30d5","Type":"ContainerDied","Data":"9c61f7edf8c45c69aa4140858a5ba2afadd9adc78b01b3c83f8ab27a28ada8b5"} Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.668902 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c61f7edf8c45c69aa4140858a5ba2afadd9adc78b01b3c83f8ab27a28ada8b5" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.668491 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.738965 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4"] Dec 17 09:36:03 crc kubenswrapper[4935]: E1217 09:36:03.739664 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fabe8c0-2434-481a-9609-03c9ba3c30d5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.739686 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fabe8c0-2434-481a-9609-03c9ba3c30d5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.739927 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fabe8c0-2434-481a-9609-03c9ba3c30d5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.740707 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.745695 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.745695 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.745954 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.747719 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.758875 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4"] Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.875387 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxbw\" (UniqueName: \"kubernetes.io/projected/a3a33180-b0e2-45f0-bcda-e3c49acfac29-kube-api-access-cxxbw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.875829 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.876023 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.977836 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.978342 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.978726 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxbw\" (UniqueName: \"kubernetes.io/projected/a3a33180-b0e2-45f0-bcda-e3c49acfac29-kube-api-access-cxxbw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.983517 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: 
\"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.983567 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:03 crc kubenswrapper[4935]: I1217 09:36:03.997818 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxbw\" (UniqueName: \"kubernetes.io/projected/a3a33180-b0e2-45f0-bcda-e3c49acfac29-kube-api-access-cxxbw\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-fk9h4\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:04 crc kubenswrapper[4935]: I1217 09:36:04.073377 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:04 crc kubenswrapper[4935]: I1217 09:36:04.591593 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4"] Dec 17 09:36:04 crc kubenswrapper[4935]: I1217 09:36:04.680066 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" event={"ID":"a3a33180-b0e2-45f0-bcda-e3c49acfac29","Type":"ContainerStarted","Data":"1137afdf38b86673aae37f0e31a72a0ba048e35908dedccb2d622a732c49ad77"} Dec 17 09:36:05 crc kubenswrapper[4935]: I1217 09:36:05.691628 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" event={"ID":"a3a33180-b0e2-45f0-bcda-e3c49acfac29","Type":"ContainerStarted","Data":"ccb407e786241e476e5bdbf45099b538d3925c613ffa51beacedca2107142a90"} Dec 17 09:36:05 crc kubenswrapper[4935]: I1217 09:36:05.717370 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" podStartSLOduration=2.209595144 podStartE2EDuration="2.717346497s" podCreationTimestamp="2025-12-17 09:36:03 +0000 UTC" firstStartedPulling="2025-12-17 09:36:04.59848237 +0000 UTC m=+1884.258323133" lastFinishedPulling="2025-12-17 09:36:05.106233723 +0000 UTC m=+1884.766074486" observedRunningTime="2025-12-17 09:36:05.712831108 +0000 UTC m=+1885.372671871" watchObservedRunningTime="2025-12-17 09:36:05.717346497 +0000 UTC m=+1885.377187260" Dec 17 09:36:09 crc kubenswrapper[4935]: I1217 09:36:09.045587 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hk94m"] Dec 17 09:36:09 crc kubenswrapper[4935]: I1217 09:36:09.055672 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-hk94m"] Dec 17 09:36:09 crc kubenswrapper[4935]: I1217 
09:36:09.138187 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9add23c-5f09-479c-94f8-4e4c35af6dde" path="/var/lib/kubelet/pods/e9add23c-5f09-479c-94f8-4e4c35af6dde/volumes" Dec 17 09:36:10 crc kubenswrapper[4935]: I1217 09:36:10.124320 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:36:10 crc kubenswrapper[4935]: I1217 09:36:10.780621 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"a83a24692330d7443f7f4ba2368305dece4f6efd7ec6cd4d7ebd5ce65b90336c"} Dec 17 09:36:34 crc kubenswrapper[4935]: I1217 09:36:34.063476 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xbvz5"] Dec 17 09:36:34 crc kubenswrapper[4935]: I1217 09:36:34.073324 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb85g"] Dec 17 09:36:34 crc kubenswrapper[4935]: I1217 09:36:34.084678 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xbvz5"] Dec 17 09:36:34 crc kubenswrapper[4935]: I1217 09:36:34.092796 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xb85g"] Dec 17 09:36:35 crc kubenswrapper[4935]: I1217 09:36:35.135859 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6288f109-f80f-4cd2-a928-914d30835d20" path="/var/lib/kubelet/pods/6288f109-f80f-4cd2-a928-914d30835d20/volumes" Dec 17 09:36:35 crc kubenswrapper[4935]: I1217 09:36:35.136834 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06" path="/var/lib/kubelet/pods/a919ea7b-ffe9-4e9b-9dd9-7f5dc6a6cf06/volumes" Dec 17 09:36:44 crc kubenswrapper[4935]: I1217 09:36:44.844307 4935 scope.go:117] "RemoveContainer" 
containerID="94a2119c0b9f91dc3fdd481b82c2852972ff9c3e055a8ccd82efc3f67a18b281" Dec 17 09:36:44 crc kubenswrapper[4935]: I1217 09:36:44.897521 4935 scope.go:117] "RemoveContainer" containerID="324a498ea9a803deea163bfc5378483bbb68558bc64874d4b0ec42f82ff2b312" Dec 17 09:36:44 crc kubenswrapper[4935]: I1217 09:36:44.958527 4935 scope.go:117] "RemoveContainer" containerID="6da47cc7beed25361c935a9c4b27ad95a0c65cf8a0ff3a41da9dc1d4f70418a7" Dec 17 09:36:45 crc kubenswrapper[4935]: I1217 09:36:45.148391 4935 generic.go:334] "Generic (PLEG): container finished" podID="a3a33180-b0e2-45f0-bcda-e3c49acfac29" containerID="ccb407e786241e476e5bdbf45099b538d3925c613ffa51beacedca2107142a90" exitCode=0 Dec 17 09:36:45 crc kubenswrapper[4935]: I1217 09:36:45.148494 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" event={"ID":"a3a33180-b0e2-45f0-bcda-e3c49acfac29","Type":"ContainerDied","Data":"ccb407e786241e476e5bdbf45099b538d3925c613ffa51beacedca2107142a90"} Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.593451 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.770304 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-ssh-key\") pod \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.770839 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-inventory\") pod \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.770962 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxxbw\" (UniqueName: \"kubernetes.io/projected/a3a33180-b0e2-45f0-bcda-e3c49acfac29-kube-api-access-cxxbw\") pod \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\" (UID: \"a3a33180-b0e2-45f0-bcda-e3c49acfac29\") " Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.783486 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3a33180-b0e2-45f0-bcda-e3c49acfac29-kube-api-access-cxxbw" (OuterVolumeSpecName: "kube-api-access-cxxbw") pod "a3a33180-b0e2-45f0-bcda-e3c49acfac29" (UID: "a3a33180-b0e2-45f0-bcda-e3c49acfac29"). InnerVolumeSpecName "kube-api-access-cxxbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.819475 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a3a33180-b0e2-45f0-bcda-e3c49acfac29" (UID: "a3a33180-b0e2-45f0-bcda-e3c49acfac29"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.826963 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-inventory" (OuterVolumeSpecName: "inventory") pod "a3a33180-b0e2-45f0-bcda-e3c49acfac29" (UID: "a3a33180-b0e2-45f0-bcda-e3c49acfac29"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.873951 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.873996 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a3a33180-b0e2-45f0-bcda-e3c49acfac29-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:36:46 crc kubenswrapper[4935]: I1217 09:36:46.874007 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxxbw\" (UniqueName: \"kubernetes.io/projected/a3a33180-b0e2-45f0-bcda-e3c49acfac29-kube-api-access-cxxbw\") on node \"crc\" DevicePath \"\"" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.170704 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" event={"ID":"a3a33180-b0e2-45f0-bcda-e3c49acfac29","Type":"ContainerDied","Data":"1137afdf38b86673aae37f0e31a72a0ba048e35908dedccb2d622a732c49ad77"} Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.170749 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1137afdf38b86673aae37f0e31a72a0ba048e35908dedccb2d622a732c49ad77" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.170762 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-fk9h4" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.409550 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww"] Dec 17 09:36:47 crc kubenswrapper[4935]: E1217 09:36:47.410727 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3a33180-b0e2-45f0-bcda-e3c49acfac29" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.410751 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a33180-b0e2-45f0-bcda-e3c49acfac29" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.410955 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3a33180-b0e2-45f0-bcda-e3c49acfac29" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.412722 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.416256 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.417404 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.418065 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.419003 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.426528 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww"] Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.588834 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.589513 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr44n\" (UniqueName: \"kubernetes.io/projected/dd4964d0-85a5-474f-a8f5-084210467887-kube-api-access-lr44n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.589564 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.691896 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.691943 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr44n\" (UniqueName: \"kubernetes.io/projected/dd4964d0-85a5-474f-a8f5-084210467887-kube-api-access-lr44n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.691975 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.700675 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: 
\"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.703544 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.715421 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr44n\" (UniqueName: \"kubernetes.io/projected/dd4964d0-85a5-474f-a8f5-084210467887-kube-api-access-lr44n\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-kwrww\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:47 crc kubenswrapper[4935]: I1217 09:36:47.731924 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:36:48 crc kubenswrapper[4935]: I1217 09:36:48.334816 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww"] Dec 17 09:36:49 crc kubenswrapper[4935]: I1217 09:36:49.194049 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" event={"ID":"dd4964d0-85a5-474f-a8f5-084210467887","Type":"ContainerStarted","Data":"9905a9acd60221a83116f540eea694b96f54e89b23597eb59838f7ef3a3a9279"} Dec 17 09:36:49 crc kubenswrapper[4935]: I1217 09:36:49.194717 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" event={"ID":"dd4964d0-85a5-474f-a8f5-084210467887","Type":"ContainerStarted","Data":"3687f15c9e09b805d9c431e1f295444db28a257ef1061b449858826aecac7bc8"} Dec 17 09:36:49 crc kubenswrapper[4935]: I1217 09:36:49.215518 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" podStartSLOduration=1.67022968 podStartE2EDuration="2.215492921s" podCreationTimestamp="2025-12-17 09:36:47 +0000 UTC" firstStartedPulling="2025-12-17 09:36:48.330623588 +0000 UTC m=+1927.990464351" lastFinishedPulling="2025-12-17 09:36:48.875886829 +0000 UTC m=+1928.535727592" observedRunningTime="2025-12-17 09:36:49.212910378 +0000 UTC m=+1928.872751181" watchObservedRunningTime="2025-12-17 09:36:49.215492921 +0000 UTC m=+1928.875333684" Dec 17 09:37:19 crc kubenswrapper[4935]: I1217 09:37:19.049688 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-slw7z"] Dec 17 09:37:19 crc kubenswrapper[4935]: I1217 09:37:19.059513 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-slw7z"] Dec 17 09:37:19 crc kubenswrapper[4935]: I1217 
09:37:19.136058 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5230b826-60b1-4db8-a7a0-e63c356fcc72" path="/var/lib/kubelet/pods/5230b826-60b1-4db8-a7a0-e63c356fcc72/volumes" Dec 17 09:37:41 crc kubenswrapper[4935]: I1217 09:37:41.667513 4935 generic.go:334] "Generic (PLEG): container finished" podID="dd4964d0-85a5-474f-a8f5-084210467887" containerID="9905a9acd60221a83116f540eea694b96f54e89b23597eb59838f7ef3a3a9279" exitCode=0 Dec 17 09:37:41 crc kubenswrapper[4935]: I1217 09:37:41.667617 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" event={"ID":"dd4964d0-85a5-474f-a8f5-084210467887","Type":"ContainerDied","Data":"9905a9acd60221a83116f540eea694b96f54e89b23597eb59838f7ef3a3a9279"} Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.133460 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.177025 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr44n\" (UniqueName: \"kubernetes.io/projected/dd4964d0-85a5-474f-a8f5-084210467887-kube-api-access-lr44n\") pod \"dd4964d0-85a5-474f-a8f5-084210467887\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.177240 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-ssh-key\") pod \"dd4964d0-85a5-474f-a8f5-084210467887\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.177405 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-inventory\") pod 
\"dd4964d0-85a5-474f-a8f5-084210467887\" (UID: \"dd4964d0-85a5-474f-a8f5-084210467887\") " Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.184072 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4964d0-85a5-474f-a8f5-084210467887-kube-api-access-lr44n" (OuterVolumeSpecName: "kube-api-access-lr44n") pod "dd4964d0-85a5-474f-a8f5-084210467887" (UID: "dd4964d0-85a5-474f-a8f5-084210467887"). InnerVolumeSpecName "kube-api-access-lr44n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.212900 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-inventory" (OuterVolumeSpecName: "inventory") pod "dd4964d0-85a5-474f-a8f5-084210467887" (UID: "dd4964d0-85a5-474f-a8f5-084210467887"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.232878 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dd4964d0-85a5-474f-a8f5-084210467887" (UID: "dd4964d0-85a5-474f-a8f5-084210467887"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.281124 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.281162 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dd4964d0-85a5-474f-a8f5-084210467887-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.281172 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr44n\" (UniqueName: \"kubernetes.io/projected/dd4964d0-85a5-474f-a8f5-084210467887-kube-api-access-lr44n\") on node \"crc\" DevicePath \"\"" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.694507 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" event={"ID":"dd4964d0-85a5-474f-a8f5-084210467887","Type":"ContainerDied","Data":"3687f15c9e09b805d9c431e1f295444db28a257ef1061b449858826aecac7bc8"} Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.694851 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3687f15c9e09b805d9c431e1f295444db28a257ef1061b449858826aecac7bc8" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.694750 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-kwrww" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.816763 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d6l7r"] Dec 17 09:37:43 crc kubenswrapper[4935]: E1217 09:37:43.817391 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4964d0-85a5-474f-a8f5-084210467887" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.817417 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4964d0-85a5-474f-a8f5-084210467887" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.817606 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4964d0-85a5-474f-a8f5-084210467887" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.818354 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.821332 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.821497 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.821827 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.821915 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.832917 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d6l7r"] Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.896317 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.896519 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffx66\" (UniqueName: \"kubernetes.io/projected/aa726742-b847-49f9-8c0b-5814e42e1c66-kube-api-access-ffx66\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.896570 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.998734 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.998898 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffx66\" (UniqueName: \"kubernetes.io/projected/aa726742-b847-49f9-8c0b-5814e42e1c66-kube-api-access-ffx66\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:43 crc kubenswrapper[4935]: I1217 09:37:43.998930 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:44 crc kubenswrapper[4935]: I1217 09:37:44.005309 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:44 crc 
kubenswrapper[4935]: I1217 09:37:44.005787 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:44 crc kubenswrapper[4935]: I1217 09:37:44.020594 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffx66\" (UniqueName: \"kubernetes.io/projected/aa726742-b847-49f9-8c0b-5814e42e1c66-kube-api-access-ffx66\") pod \"ssh-known-hosts-edpm-deployment-d6l7r\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:44 crc kubenswrapper[4935]: I1217 09:37:44.156628 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:44 crc kubenswrapper[4935]: I1217 09:37:44.857266 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-d6l7r"] Dec 17 09:37:45 crc kubenswrapper[4935]: I1217 09:37:45.071809 4935 scope.go:117] "RemoveContainer" containerID="67f67f65ca0a4cd9cd9b72e4c1a34dc08ede7bee29601513cb8de09b8358403e" Dec 17 09:37:45 crc kubenswrapper[4935]: I1217 09:37:45.719732 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" event={"ID":"aa726742-b847-49f9-8c0b-5814e42e1c66","Type":"ContainerStarted","Data":"4e6e34108afd4cdf57d11f70b70ab36b084420848383ca24e150dbe66af5adce"} Dec 17 09:37:45 crc kubenswrapper[4935]: I1217 09:37:45.720252 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" 
event={"ID":"aa726742-b847-49f9-8c0b-5814e42e1c66","Type":"ContainerStarted","Data":"1e80847d163d5cf8c089af41ffad0d0f8bc33df5553bd7689a729369639414f7"} Dec 17 09:37:45 crc kubenswrapper[4935]: I1217 09:37:45.742638 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" podStartSLOduration=2.118822178 podStartE2EDuration="2.742614349s" podCreationTimestamp="2025-12-17 09:37:43 +0000 UTC" firstStartedPulling="2025-12-17 09:37:44.865220468 +0000 UTC m=+1984.525061231" lastFinishedPulling="2025-12-17 09:37:45.489012639 +0000 UTC m=+1985.148853402" observedRunningTime="2025-12-17 09:37:45.733602651 +0000 UTC m=+1985.393443404" watchObservedRunningTime="2025-12-17 09:37:45.742614349 +0000 UTC m=+1985.402455112" Dec 17 09:37:53 crc kubenswrapper[4935]: I1217 09:37:53.790571 4935 generic.go:334] "Generic (PLEG): container finished" podID="aa726742-b847-49f9-8c0b-5814e42e1c66" containerID="4e6e34108afd4cdf57d11f70b70ab36b084420848383ca24e150dbe66af5adce" exitCode=0 Dec 17 09:37:53 crc kubenswrapper[4935]: I1217 09:37:53.790663 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" event={"ID":"aa726742-b847-49f9-8c0b-5814e42e1c66","Type":"ContainerDied","Data":"4e6e34108afd4cdf57d11f70b70ab36b084420848383ca24e150dbe66af5adce"} Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.212562 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.265537 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-inventory-0\") pod \"aa726742-b847-49f9-8c0b-5814e42e1c66\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.266106 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffx66\" (UniqueName: \"kubernetes.io/projected/aa726742-b847-49f9-8c0b-5814e42e1c66-kube-api-access-ffx66\") pod \"aa726742-b847-49f9-8c0b-5814e42e1c66\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.266204 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-ssh-key-openstack-edpm-ipam\") pod \"aa726742-b847-49f9-8c0b-5814e42e1c66\" (UID: \"aa726742-b847-49f9-8c0b-5814e42e1c66\") " Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.272830 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa726742-b847-49f9-8c0b-5814e42e1c66-kube-api-access-ffx66" (OuterVolumeSpecName: "kube-api-access-ffx66") pod "aa726742-b847-49f9-8c0b-5814e42e1c66" (UID: "aa726742-b847-49f9-8c0b-5814e42e1c66"). InnerVolumeSpecName "kube-api-access-ffx66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.300438 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "aa726742-b847-49f9-8c0b-5814e42e1c66" (UID: "aa726742-b847-49f9-8c0b-5814e42e1c66"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.300569 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "aa726742-b847-49f9-8c0b-5814e42e1c66" (UID: "aa726742-b847-49f9-8c0b-5814e42e1c66"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.369026 4935 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.369083 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffx66\" (UniqueName: \"kubernetes.io/projected/aa726742-b847-49f9-8c0b-5814e42e1c66-kube-api-access-ffx66\") on node \"crc\" DevicePath \"\"" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.369108 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/aa726742-b847-49f9-8c0b-5814e42e1c66-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.812090 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" 
event={"ID":"aa726742-b847-49f9-8c0b-5814e42e1c66","Type":"ContainerDied","Data":"1e80847d163d5cf8c089af41ffad0d0f8bc33df5553bd7689a729369639414f7"} Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.812144 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e80847d163d5cf8c089af41ffad0d0f8bc33df5553bd7689a729369639414f7" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.812226 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-d6l7r" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.903984 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz"] Dec 17 09:37:55 crc kubenswrapper[4935]: E1217 09:37:55.904465 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa726742-b847-49f9-8c0b-5814e42e1c66" containerName="ssh-known-hosts-edpm-deployment" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.904484 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa726742-b847-49f9-8c0b-5814e42e1c66" containerName="ssh-known-hosts-edpm-deployment" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.904667 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa726742-b847-49f9-8c0b-5814e42e1c66" containerName="ssh-known-hosts-edpm-deployment" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.905388 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.911347 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.911484 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.911484 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.912093 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.919232 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz"] Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.979602 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4stz\" (UniqueName: \"kubernetes.io/projected/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-kube-api-access-w4stz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.979672 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:55 crc kubenswrapper[4935]: I1217 09:37:55.979725 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.082002 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4stz\" (UniqueName: \"kubernetes.io/projected/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-kube-api-access-w4stz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.082098 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.082159 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.087800 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.088455 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.099528 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4stz\" (UniqueName: \"kubernetes.io/projected/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-kube-api-access-w4stz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-8cgmz\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.228298 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.727186 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz"] Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.736012 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 09:37:56 crc kubenswrapper[4935]: I1217 09:37:56.820811 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" event={"ID":"c2da33ef-f139-4ce2-9ef3-2a15cefcf653","Type":"ContainerStarted","Data":"4a790bf2354ae143d3e885a13ac8f378758678e724332c0748e6ede32a18256f"} Dec 17 09:37:57 crc kubenswrapper[4935]: I1217 09:37:57.829827 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" event={"ID":"c2da33ef-f139-4ce2-9ef3-2a15cefcf653","Type":"ContainerStarted","Data":"264f881433ecdd2f3ca21042eb7c29e90857ad832298dbf9ea025222827205e1"} Dec 17 09:37:57 crc kubenswrapper[4935]: I1217 09:37:57.851140 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" podStartSLOduration=2.273765373 podStartE2EDuration="2.851118241s" podCreationTimestamp="2025-12-17 09:37:55 +0000 UTC" firstStartedPulling="2025-12-17 09:37:56.735816039 +0000 UTC m=+1996.395656812" lastFinishedPulling="2025-12-17 09:37:57.313168917 +0000 UTC m=+1996.973009680" observedRunningTime="2025-12-17 09:37:57.84570789 +0000 UTC m=+1997.505548663" watchObservedRunningTime="2025-12-17 09:37:57.851118241 +0000 UTC m=+1997.510959004" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.352322 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lp2rp"] Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.355613 4935 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.364582 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lp2rp"] Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.530941 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-utilities\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.531213 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-catalog-content\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.531266 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckh8j\" (UniqueName: \"kubernetes.io/projected/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-kube-api-access-ckh8j\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.633290 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-catalog-content\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.633358 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ckh8j\" (UniqueName: \"kubernetes.io/projected/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-kube-api-access-ckh8j\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.633419 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-utilities\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.633883 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-catalog-content\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.633923 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-utilities\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.657671 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckh8j\" (UniqueName: \"kubernetes.io/projected/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-kube-api-access-ckh8j\") pod \"redhat-operators-lp2rp\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:02 crc kubenswrapper[4935]: I1217 09:38:02.685720 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:03 crc kubenswrapper[4935]: I1217 09:38:03.152526 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lp2rp"] Dec 17 09:38:03 crc kubenswrapper[4935]: W1217 09:38:03.159132 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e3ea4c1_b683_4067_bdc5_a3ad8b4e8b8b.slice/crio-3d12ac5ddefd4ddab6b8360c921714adf3ac172cdf06a40ede41564db4f27e89 WatchSource:0}: Error finding container 3d12ac5ddefd4ddab6b8360c921714adf3ac172cdf06a40ede41564db4f27e89: Status 404 returned error can't find the container with id 3d12ac5ddefd4ddab6b8360c921714adf3ac172cdf06a40ede41564db4f27e89 Dec 17 09:38:03 crc kubenswrapper[4935]: I1217 09:38:03.955633 4935 generic.go:334] "Generic (PLEG): container finished" podID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerID="f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49" exitCode=0 Dec 17 09:38:03 crc kubenswrapper[4935]: I1217 09:38:03.955749 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp2rp" event={"ID":"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b","Type":"ContainerDied","Data":"f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49"} Dec 17 09:38:03 crc kubenswrapper[4935]: I1217 09:38:03.955945 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp2rp" event={"ID":"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b","Type":"ContainerStarted","Data":"3d12ac5ddefd4ddab6b8360c921714adf3ac172cdf06a40ede41564db4f27e89"} Dec 17 09:38:05 crc kubenswrapper[4935]: I1217 09:38:05.982713 4935 generic.go:334] "Generic (PLEG): container finished" podID="c2da33ef-f139-4ce2-9ef3-2a15cefcf653" containerID="264f881433ecdd2f3ca21042eb7c29e90857ad832298dbf9ea025222827205e1" exitCode=0 Dec 17 09:38:05 crc kubenswrapper[4935]: I1217 09:38:05.982794 
4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" event={"ID":"c2da33ef-f139-4ce2-9ef3-2a15cefcf653","Type":"ContainerDied","Data":"264f881433ecdd2f3ca21042eb7c29e90857ad832298dbf9ea025222827205e1"} Dec 17 09:38:05 crc kubenswrapper[4935]: I1217 09:38:05.991097 4935 generic.go:334] "Generic (PLEG): container finished" podID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerID="1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328" exitCode=0 Dec 17 09:38:05 crc kubenswrapper[4935]: I1217 09:38:05.991147 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp2rp" event={"ID":"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b","Type":"ContainerDied","Data":"1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328"} Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.007580 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp2rp" event={"ID":"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b","Type":"ContainerStarted","Data":"8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7"} Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.039344 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lp2rp" podStartSLOduration=2.33078374 podStartE2EDuration="5.039307432s" podCreationTimestamp="2025-12-17 09:38:02 +0000 UTC" firstStartedPulling="2025-12-17 09:38:03.95902483 +0000 UTC m=+2003.618865593" lastFinishedPulling="2025-12-17 09:38:06.667548522 +0000 UTC m=+2006.327389285" observedRunningTime="2025-12-17 09:38:07.035751266 +0000 UTC m=+2006.695592029" watchObservedRunningTime="2025-12-17 09:38:07.039307432 +0000 UTC m=+2006.699148235" Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.479817 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.654142 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-ssh-key\") pod \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.654449 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-inventory\") pod \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.654508 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4stz\" (UniqueName: \"kubernetes.io/projected/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-kube-api-access-w4stz\") pod \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\" (UID: \"c2da33ef-f139-4ce2-9ef3-2a15cefcf653\") " Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.671980 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-kube-api-access-w4stz" (OuterVolumeSpecName: "kube-api-access-w4stz") pod "c2da33ef-f139-4ce2-9ef3-2a15cefcf653" (UID: "c2da33ef-f139-4ce2-9ef3-2a15cefcf653"). InnerVolumeSpecName "kube-api-access-w4stz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.686526 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-inventory" (OuterVolumeSpecName: "inventory") pod "c2da33ef-f139-4ce2-9ef3-2a15cefcf653" (UID: "c2da33ef-f139-4ce2-9ef3-2a15cefcf653"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.717477 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c2da33ef-f139-4ce2-9ef3-2a15cefcf653" (UID: "c2da33ef-f139-4ce2-9ef3-2a15cefcf653"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.757673 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.757707 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:07 crc kubenswrapper[4935]: I1217 09:38:07.757718 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4stz\" (UniqueName: \"kubernetes.io/projected/c2da33ef-f139-4ce2-9ef3-2a15cefcf653-kube-api-access-w4stz\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.017875 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" event={"ID":"c2da33ef-f139-4ce2-9ef3-2a15cefcf653","Type":"ContainerDied","Data":"4a790bf2354ae143d3e885a13ac8f378758678e724332c0748e6ede32a18256f"} Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.017938 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a790bf2354ae143d3e885a13ac8f378758678e724332c0748e6ede32a18256f" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.017904 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-8cgmz" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.090706 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86"] Dec 17 09:38:08 crc kubenswrapper[4935]: E1217 09:38:08.091165 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2da33ef-f139-4ce2-9ef3-2a15cefcf653" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.091188 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2da33ef-f139-4ce2-9ef3-2a15cefcf653" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.091387 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2da33ef-f139-4ce2-9ef3-2a15cefcf653" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.092060 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.094570 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.095168 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.098906 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.101853 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.106474 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86"] Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.270119 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlbr2\" (UniqueName: \"kubernetes.io/projected/63b4de7b-7933-4eba-8248-5ef0db9caa3e-kube-api-access-xlbr2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.270300 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.270377 4935 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.372232 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlbr2\" (UniqueName: \"kubernetes.io/projected/63b4de7b-7933-4eba-8248-5ef0db9caa3e-kube-api-access-xlbr2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.372357 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.372398 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.378307 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.384306 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.395625 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlbr2\" (UniqueName: \"kubernetes.io/projected/63b4de7b-7933-4eba-8248-5ef0db9caa3e-kube-api-access-xlbr2\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.415439 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:08 crc kubenswrapper[4935]: I1217 09:38:08.988563 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86"] Dec 17 09:38:09 crc kubenswrapper[4935]: I1217 09:38:09.034173 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" event={"ID":"63b4de7b-7933-4eba-8248-5ef0db9caa3e","Type":"ContainerStarted","Data":"ea931eaf597317e22c3a5c2c34c1b7c993baa74f410a19d61a7a2dee276145e5"} Dec 17 09:38:11 crc kubenswrapper[4935]: I1217 09:38:11.055901 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" event={"ID":"63b4de7b-7933-4eba-8248-5ef0db9caa3e","Type":"ContainerStarted","Data":"eda6c07a36b1a96a3a6f25c4e7b14f74fd1001175eedfca6068f7aaf08d814c8"} Dec 17 09:38:11 crc kubenswrapper[4935]: I1217 09:38:11.093108 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" podStartSLOduration=2.042929049 podStartE2EDuration="3.093081903s" podCreationTimestamp="2025-12-17 09:38:08 +0000 UTC" firstStartedPulling="2025-12-17 09:38:09.001543597 +0000 UTC m=+2008.661384360" lastFinishedPulling="2025-12-17 09:38:10.051696451 +0000 UTC m=+2009.711537214" observedRunningTime="2025-12-17 09:38:11.079190816 +0000 UTC m=+2010.739031579" watchObservedRunningTime="2025-12-17 09:38:11.093081903 +0000 UTC m=+2010.752922656" Dec 17 09:38:12 crc kubenswrapper[4935]: I1217 09:38:12.686115 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:12 crc kubenswrapper[4935]: I1217 09:38:12.686536 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:12 crc 
kubenswrapper[4935]: I1217 09:38:12.733824 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:13 crc kubenswrapper[4935]: I1217 09:38:13.118851 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:13 crc kubenswrapper[4935]: I1217 09:38:13.189300 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lp2rp"] Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.090655 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lp2rp" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="registry-server" containerID="cri-o://8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7" gracePeriod=2 Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.558846 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.749498 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-utilities\") pod \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.749694 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckh8j\" (UniqueName: \"kubernetes.io/projected/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-kube-api-access-ckh8j\") pod \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.749758 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-catalog-content\") pod \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\" (UID: \"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b\") " Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.750443 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-utilities" (OuterVolumeSpecName: "utilities") pod "6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" (UID: "6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.755462 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-kube-api-access-ckh8j" (OuterVolumeSpecName: "kube-api-access-ckh8j") pod "6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" (UID: "6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b"). InnerVolumeSpecName "kube-api-access-ckh8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.847504 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" (UID: "6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.852630 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.852657 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckh8j\" (UniqueName: \"kubernetes.io/projected/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-kube-api-access-ckh8j\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:15 crc kubenswrapper[4935]: I1217 09:38:15.852667 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.104951 4935 generic.go:334] "Generic (PLEG): container finished" podID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerID="8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7" exitCode=0 Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.105008 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lp2rp" event={"ID":"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b","Type":"ContainerDied","Data":"8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7"} Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.105044 4935 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-lp2rp" event={"ID":"6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b","Type":"ContainerDied","Data":"3d12ac5ddefd4ddab6b8360c921714adf3ac172cdf06a40ede41564db4f27e89"} Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.105067 4935 scope.go:117] "RemoveContainer" containerID="8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.105229 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lp2rp" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.148641 4935 scope.go:117] "RemoveContainer" containerID="1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.169085 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lp2rp"] Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.179344 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lp2rp"] Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.180398 4935 scope.go:117] "RemoveContainer" containerID="f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.224381 4935 scope.go:117] "RemoveContainer" containerID="8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7" Dec 17 09:38:16 crc kubenswrapper[4935]: E1217 09:38:16.224958 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7\": container with ID starting with 8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7 not found: ID does not exist" containerID="8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.224994 4935 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7"} err="failed to get container status \"8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7\": rpc error: code = NotFound desc = could not find container \"8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7\": container with ID starting with 8416f0507f94d28dc96108e8d9da503e906dbc6cca87fe86c193710d836f09b7 not found: ID does not exist" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.225020 4935 scope.go:117] "RemoveContainer" containerID="1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328" Dec 17 09:38:16 crc kubenswrapper[4935]: E1217 09:38:16.225794 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328\": container with ID starting with 1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328 not found: ID does not exist" containerID="1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.225856 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328"} err="failed to get container status \"1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328\": rpc error: code = NotFound desc = could not find container \"1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328\": container with ID starting with 1d5c3c6096262cdf6f55354ee77b77560d0db301463806b53cfffa510d4cf328 not found: ID does not exist" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.225897 4935 scope.go:117] "RemoveContainer" containerID="f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49" Dec 17 09:38:16 crc kubenswrapper[4935]: E1217 
09:38:16.226364 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49\": container with ID starting with f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49 not found: ID does not exist" containerID="f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49" Dec 17 09:38:16 crc kubenswrapper[4935]: I1217 09:38:16.226397 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49"} err="failed to get container status \"f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49\": rpc error: code = NotFound desc = could not find container \"f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49\": container with ID starting with f631651cf4675c18a580879c5f168bd5845295564b8fee6cf9df9c00a4450c49 not found: ID does not exist" Dec 17 09:38:17 crc kubenswrapper[4935]: I1217 09:38:17.140720 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" path="/var/lib/kubelet/pods/6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b/volumes" Dec 17 09:38:20 crc kubenswrapper[4935]: I1217 09:38:20.167457 4935 generic.go:334] "Generic (PLEG): container finished" podID="63b4de7b-7933-4eba-8248-5ef0db9caa3e" containerID="eda6c07a36b1a96a3a6f25c4e7b14f74fd1001175eedfca6068f7aaf08d814c8" exitCode=0 Dec 17 09:38:20 crc kubenswrapper[4935]: I1217 09:38:20.167516 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" event={"ID":"63b4de7b-7933-4eba-8248-5ef0db9caa3e","Type":"ContainerDied","Data":"eda6c07a36b1a96a3a6f25c4e7b14f74fd1001175eedfca6068f7aaf08d814c8"} Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.632449 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.794823 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-ssh-key\") pod \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.795480 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlbr2\" (UniqueName: \"kubernetes.io/projected/63b4de7b-7933-4eba-8248-5ef0db9caa3e-kube-api-access-xlbr2\") pod \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.795929 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-inventory\") pod \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\" (UID: \"63b4de7b-7933-4eba-8248-5ef0db9caa3e\") " Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.803836 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63b4de7b-7933-4eba-8248-5ef0db9caa3e-kube-api-access-xlbr2" (OuterVolumeSpecName: "kube-api-access-xlbr2") pod "63b4de7b-7933-4eba-8248-5ef0db9caa3e" (UID: "63b4de7b-7933-4eba-8248-5ef0db9caa3e"). InnerVolumeSpecName "kube-api-access-xlbr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.831363 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-inventory" (OuterVolumeSpecName: "inventory") pod "63b4de7b-7933-4eba-8248-5ef0db9caa3e" (UID: "63b4de7b-7933-4eba-8248-5ef0db9caa3e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.842626 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "63b4de7b-7933-4eba-8248-5ef0db9caa3e" (UID: "63b4de7b-7933-4eba-8248-5ef0db9caa3e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.898906 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.898945 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/63b4de7b-7933-4eba-8248-5ef0db9caa3e-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:21 crc kubenswrapper[4935]: I1217 09:38:21.898956 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlbr2\" (UniqueName: \"kubernetes.io/projected/63b4de7b-7933-4eba-8248-5ef0db9caa3e-kube-api-access-xlbr2\") on node \"crc\" DevicePath \"\"" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.201115 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" event={"ID":"63b4de7b-7933-4eba-8248-5ef0db9caa3e","Type":"ContainerDied","Data":"ea931eaf597317e22c3a5c2c34c1b7c993baa74f410a19d61a7a2dee276145e5"} Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.201174 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea931eaf597317e22c3a5c2c34c1b7c993baa74f410a19d61a7a2dee276145e5" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.201247 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.290109 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c"] Dec 17 09:38:22 crc kubenswrapper[4935]: E1217 09:38:22.290828 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63b4de7b-7933-4eba-8248-5ef0db9caa3e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.290859 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="63b4de7b-7933-4eba-8248-5ef0db9caa3e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:38:22 crc kubenswrapper[4935]: E1217 09:38:22.290885 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="extract-content" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.290893 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="extract-content" Dec 17 09:38:22 crc kubenswrapper[4935]: E1217 09:38:22.290912 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="extract-utilities" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.290922 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="extract-utilities" Dec 17 09:38:22 crc kubenswrapper[4935]: E1217 09:38:22.290950 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="registry-server" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.290958 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="registry-server" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.291248 
4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e3ea4c1-b683-4067-bdc5-a3ad8b4e8b8b" containerName="registry-server" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.291267 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="63b4de7b-7933-4eba-8248-5ef0db9caa3e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.292204 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.294646 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.294670 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.295026 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.295199 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.295044 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.300673 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.301001 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.301835 4935 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.332018 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c"] Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.407465 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.407544 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.407676 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.407904 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.407977 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408040 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408099 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408178 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408222 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408310 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408436 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqh94\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-kube-api-access-vqh94\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408518 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408672 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.408710 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.510950 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511030 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: 
\"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511065 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511094 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511120 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511152 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511178 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511209 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511257 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqh94\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-kube-api-access-vqh94\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511314 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 
09:38:22.511366 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511394 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511468 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.511516 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.515521 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.515773 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.516229 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.517100 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.518817 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.520160 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.520536 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.520844 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.522725 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.522727 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.522732 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.526079 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.526159 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 
09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.534450 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqh94\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-kube-api-access-vqh94\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:22 crc kubenswrapper[4935]: I1217 09:38:22.628332 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:38:23 crc kubenswrapper[4935]: I1217 09:38:23.183607 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c"] Dec 17 09:38:23 crc kubenswrapper[4935]: I1217 09:38:23.211229 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" event={"ID":"54cfb029-5b74-4da9-b0d3-0033fe2b3968","Type":"ContainerStarted","Data":"dff3d648287db5454ddd5b0a880fe7e94bd2b936d2c9fd6321d4f06f3c340cae"} Dec 17 09:38:24 crc kubenswrapper[4935]: I1217 09:38:24.223304 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" event={"ID":"54cfb029-5b74-4da9-b0d3-0033fe2b3968","Type":"ContainerStarted","Data":"d18975dd52d0bb9f99c0ddae4e701ec1e8539823358fde5dfb383526906272c9"} Dec 17 09:38:24 crc kubenswrapper[4935]: I1217 09:38:24.254260 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" podStartSLOduration=1.582053285 podStartE2EDuration="2.254234659s" podCreationTimestamp="2025-12-17 09:38:22 +0000 UTC" firstStartedPulling="2025-12-17 09:38:23.193380286 +0000 UTC m=+2022.853221049" lastFinishedPulling="2025-12-17 09:38:23.86556166 +0000 UTC 
m=+2023.525402423" observedRunningTime="2025-12-17 09:38:24.251567014 +0000 UTC m=+2023.911407787" watchObservedRunningTime="2025-12-17 09:38:24.254234659 +0000 UTC m=+2023.914075442" Dec 17 09:38:30 crc kubenswrapper[4935]: I1217 09:38:30.130668 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:38:30 crc kubenswrapper[4935]: I1217 09:38:30.131162 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:39:00 crc kubenswrapper[4935]: I1217 09:39:00.130742 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:39:00 crc kubenswrapper[4935]: I1217 09:39:00.131359 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:39:03 crc kubenswrapper[4935]: I1217 09:39:03.005608 4935 generic.go:334] "Generic (PLEG): container finished" podID="54cfb029-5b74-4da9-b0d3-0033fe2b3968" containerID="d18975dd52d0bb9f99c0ddae4e701ec1e8539823358fde5dfb383526906272c9" exitCode=0 Dec 17 09:39:03 crc kubenswrapper[4935]: I1217 09:39:03.005689 
4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" event={"ID":"54cfb029-5b74-4da9-b0d3-0033fe2b3968","Type":"ContainerDied","Data":"d18975dd52d0bb9f99c0ddae4e701ec1e8539823358fde5dfb383526906272c9"} Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.407839 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533226 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-inventory\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533405 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqh94\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-kube-api-access-vqh94\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533485 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-neutron-metadata-combined-ca-bundle\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533584 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-repo-setup-combined-ca-bundle\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc 
kubenswrapper[4935]: I1217 09:39:04.533653 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533684 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-bootstrap-combined-ca-bundle\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533727 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-nova-combined-ca-bundle\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533824 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-telemetry-combined-ca-bundle\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533858 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-libvirt-combined-ca-bundle\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533883 4935 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533911 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.533953 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ssh-key\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.534031 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-ovn-default-certs-0\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.534116 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ovn-combined-ca-bundle\") pod \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\" (UID: \"54cfb029-5b74-4da9-b0d3-0033fe2b3968\") " Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.542800 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.542840 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.543574 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.543628 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.543783 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.543809 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.544027 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.545989 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-kube-api-access-vqh94" (OuterVolumeSpecName: "kube-api-access-vqh94") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "kube-api-access-vqh94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.547101 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.547574 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.547622 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.548163 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). 
InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.569811 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-inventory" (OuterVolumeSpecName: "inventory") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.576354 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "54cfb029-5b74-4da9-b0d3-0033fe2b3968" (UID: "54cfb029-5b74-4da9-b0d3-0033fe2b3968"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.637907 4935 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.637956 4935 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.637973 4935 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.637990 4935 reconciler_common.go:293] "Volume detached for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638009 4935 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638025 4935 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638045 4935 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638062 4935 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638080 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638100 4935 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 17 
09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638123 4935 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638139 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638154 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqh94\" (UniqueName: \"kubernetes.io/projected/54cfb029-5b74-4da9-b0d3-0033fe2b3968-kube-api-access-vqh94\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:04 crc kubenswrapper[4935]: I1217 09:39:04.638167 4935 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54cfb029-5b74-4da9-b0d3-0033fe2b3968-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.023218 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" event={"ID":"54cfb029-5b74-4da9-b0d3-0033fe2b3968","Type":"ContainerDied","Data":"dff3d648287db5454ddd5b0a880fe7e94bd2b936d2c9fd6321d4f06f3c340cae"} Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.023635 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dff3d648287db5454ddd5b0a880fe7e94bd2b936d2c9fd6321d4f06f3c340cae" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.023262 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.211066 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v"] Dec 17 09:39:05 crc kubenswrapper[4935]: E1217 09:39:05.212285 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54cfb029-5b74-4da9-b0d3-0033fe2b3968" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.212310 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="54cfb029-5b74-4da9-b0d3-0033fe2b3968" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.212820 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="54cfb029-5b74-4da9-b0d3-0033fe2b3968" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.214181 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.218990 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.219263 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.219507 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.219675 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.219818 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.235089 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v"] Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.351905 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.351982 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.352556 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.352709 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.352758 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcd9x\" (UniqueName: \"kubernetes.io/projected/e59ff9f5-6277-4150-9d1b-91d323743ab8-kube-api-access-tcd9x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.455151 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.455216 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tcd9x\" (UniqueName: \"kubernetes.io/projected/e59ff9f5-6277-4150-9d1b-91d323743ab8-kube-api-access-tcd9x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.455334 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.455370 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.455400 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.456296 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 
17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.459973 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.461332 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.461845 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.474782 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcd9x\" (UniqueName: \"kubernetes.io/projected/e59ff9f5-6277-4150-9d1b-91d323743ab8-kube-api-access-tcd9x\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-z6j8v\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:05 crc kubenswrapper[4935]: I1217 09:39:05.544508 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:39:06 crc kubenswrapper[4935]: I1217 09:39:06.101496 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v"] Dec 17 09:39:06 crc kubenswrapper[4935]: W1217 09:39:06.113149 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode59ff9f5_6277_4150_9d1b_91d323743ab8.slice/crio-ba9c3890206806831f2f6ea5a2fb1f4124fde75ed8c89cb877407a97dd1b3e62 WatchSource:0}: Error finding container ba9c3890206806831f2f6ea5a2fb1f4124fde75ed8c89cb877407a97dd1b3e62: Status 404 returned error can't find the container with id ba9c3890206806831f2f6ea5a2fb1f4124fde75ed8c89cb877407a97dd1b3e62 Dec 17 09:39:07 crc kubenswrapper[4935]: I1217 09:39:07.043353 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" event={"ID":"e59ff9f5-6277-4150-9d1b-91d323743ab8","Type":"ContainerStarted","Data":"5a0dcd5735a966208519c7d515d2df2402df3b92084512c4d6ce7ce0a620b7f2"} Dec 17 09:39:07 crc kubenswrapper[4935]: I1217 09:39:07.044067 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" event={"ID":"e59ff9f5-6277-4150-9d1b-91d323743ab8","Type":"ContainerStarted","Data":"ba9c3890206806831f2f6ea5a2fb1f4124fde75ed8c89cb877407a97dd1b3e62"} Dec 17 09:39:07 crc kubenswrapper[4935]: I1217 09:39:07.071421 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" podStartSLOduration=1.422124997 podStartE2EDuration="2.071397105s" podCreationTimestamp="2025-12-17 09:39:05 +0000 UTC" firstStartedPulling="2025-12-17 09:39:06.117914791 +0000 UTC m=+2065.777755554" lastFinishedPulling="2025-12-17 09:39:06.767186869 +0000 UTC m=+2066.427027662" observedRunningTime="2025-12-17 
09:39:07.064451597 +0000 UTC m=+2066.724292420" watchObservedRunningTime="2025-12-17 09:39:07.071397105 +0000 UTC m=+2066.731237888" Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.130507 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.131154 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.131207 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.131910 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a83a24692330d7443f7f4ba2368305dece4f6efd7ec6cd4d7ebd5ce65b90336c"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.131969 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://a83a24692330d7443f7f4ba2368305dece4f6efd7ec6cd4d7ebd5ce65b90336c" gracePeriod=600 Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.276457 4935 generic.go:334] 
"Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="a83a24692330d7443f7f4ba2368305dece4f6efd7ec6cd4d7ebd5ce65b90336c" exitCode=0 Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.276562 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"a83a24692330d7443f7f4ba2368305dece4f6efd7ec6cd4d7ebd5ce65b90336c"} Dec 17 09:39:30 crc kubenswrapper[4935]: I1217 09:39:30.276909 4935 scope.go:117] "RemoveContainer" containerID="28fecfb793e47e83b72928a24538f8189b3513306e098d8da77b9a52af1c41d9" Dec 17 09:39:31 crc kubenswrapper[4935]: I1217 09:39:31.287378 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"} Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.050716 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rlgm6"] Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.055925 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.087989 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rlgm6"] Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.108905 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-catalog-content\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.108985 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d5sv\" (UniqueName: \"kubernetes.io/projected/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-kube-api-access-4d5sv\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.109109 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-utilities\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.211146 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-catalog-content\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.211760 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4d5sv\" (UniqueName: \"kubernetes.io/projected/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-kube-api-access-4d5sv\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.212047 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-utilities\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.212744 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-utilities\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.213585 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-catalog-content\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.239139 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d5sv\" (UniqueName: \"kubernetes.io/projected/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-kube-api-access-4d5sv\") pod \"community-operators-rlgm6\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.385575 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.646077 4935 generic.go:334] "Generic (PLEG): container finished" podID="e59ff9f5-6277-4150-9d1b-91d323743ab8" containerID="5a0dcd5735a966208519c7d515d2df2402df3b92084512c4d6ce7ce0a620b7f2" exitCode=0 Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.646179 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" event={"ID":"e59ff9f5-6277-4150-9d1b-91d323743ab8","Type":"ContainerDied","Data":"5a0dcd5735a966208519c7d515d2df2402df3b92084512c4d6ce7ce0a620b7f2"} Dec 17 09:40:11 crc kubenswrapper[4935]: I1217 09:40:11.974587 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rlgm6"] Dec 17 09:40:12 crc kubenswrapper[4935]: I1217 09:40:12.658257 4935 generic.go:334] "Generic (PLEG): container finished" podID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerID="0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309" exitCode=0 Dec 17 09:40:12 crc kubenswrapper[4935]: I1217 09:40:12.658405 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlgm6" event={"ID":"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b","Type":"ContainerDied","Data":"0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309"} Dec 17 09:40:12 crc kubenswrapper[4935]: I1217 09:40:12.658883 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlgm6" event={"ID":"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b","Type":"ContainerStarted","Data":"245b4c1f035f3a39cb19dabcc008c6bc02393879fea30193e8c62cad387c5737"} Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.095412 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.157227 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-inventory\") pod \"e59ff9f5-6277-4150-9d1b-91d323743ab8\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.157335 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ssh-key\") pod \"e59ff9f5-6277-4150-9d1b-91d323743ab8\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.157536 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovn-combined-ca-bundle\") pod \"e59ff9f5-6277-4150-9d1b-91d323743ab8\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.157597 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovncontroller-config-0\") pod \"e59ff9f5-6277-4150-9d1b-91d323743ab8\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.157695 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcd9x\" (UniqueName: \"kubernetes.io/projected/e59ff9f5-6277-4150-9d1b-91d323743ab8-kube-api-access-tcd9x\") pod \"e59ff9f5-6277-4150-9d1b-91d323743ab8\" (UID: \"e59ff9f5-6277-4150-9d1b-91d323743ab8\") " Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.163648 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e59ff9f5-6277-4150-9d1b-91d323743ab8" (UID: "e59ff9f5-6277-4150-9d1b-91d323743ab8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.164113 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e59ff9f5-6277-4150-9d1b-91d323743ab8-kube-api-access-tcd9x" (OuterVolumeSpecName: "kube-api-access-tcd9x") pod "e59ff9f5-6277-4150-9d1b-91d323743ab8" (UID: "e59ff9f5-6277-4150-9d1b-91d323743ab8"). InnerVolumeSpecName "kube-api-access-tcd9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.187972 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "e59ff9f5-6277-4150-9d1b-91d323743ab8" (UID: "e59ff9f5-6277-4150-9d1b-91d323743ab8"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.188330 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-inventory" (OuterVolumeSpecName: "inventory") pod "e59ff9f5-6277-4150-9d1b-91d323743ab8" (UID: "e59ff9f5-6277-4150-9d1b-91d323743ab8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.194462 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e59ff9f5-6277-4150-9d1b-91d323743ab8" (UID: "e59ff9f5-6277-4150-9d1b-91d323743ab8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.259763 4935 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.259800 4935 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/e59ff9f5-6277-4150-9d1b-91d323743ab8-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.259810 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcd9x\" (UniqueName: \"kubernetes.io/projected/e59ff9f5-6277-4150-9d1b-91d323743ab8-kube-api-access-tcd9x\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.259821 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.259830 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e59ff9f5-6277-4150-9d1b-91d323743ab8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.672102 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" event={"ID":"e59ff9f5-6277-4150-9d1b-91d323743ab8","Type":"ContainerDied","Data":"ba9c3890206806831f2f6ea5a2fb1f4124fde75ed8c89cb877407a97dd1b3e62"} Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.672609 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba9c3890206806831f2f6ea5a2fb1f4124fde75ed8c89cb877407a97dd1b3e62" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.672138 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-z6j8v" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.675673 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlgm6" event={"ID":"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b","Type":"ContainerStarted","Data":"52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1"} Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.805435 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6"] Dec 17 09:40:13 crc kubenswrapper[4935]: E1217 09:40:13.806438 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e59ff9f5-6277-4150-9d1b-91d323743ab8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.806824 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e59ff9f5-6277-4150-9d1b-91d323743ab8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.807128 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e59ff9f5-6277-4150-9d1b-91d323743ab8" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.809088 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.813962 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.814059 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.814109 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.814263 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.814467 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.814849 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.843258 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6"] Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.876664 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.876859 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.876943 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8zhf\" (UniqueName: \"kubernetes.io/projected/4454b07b-03d5-46e3-8277-232e491c91c1-kube-api-access-r8zhf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.876991 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.877024 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.877107 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.977997 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.978069 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8zhf\" (UniqueName: \"kubernetes.io/projected/4454b07b-03d5-46e3-8277-232e491c91c1-kube-api-access-r8zhf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.978105 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.978129 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.978183 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.978219 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.984559 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.984960 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.985255 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.986365 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.987006 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:13 crc kubenswrapper[4935]: I1217 09:40:13.993626 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8zhf\" (UniqueName: \"kubernetes.io/projected/4454b07b-03d5-46e3-8277-232e491c91c1-kube-api-access-r8zhf\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:14 crc 
kubenswrapper[4935]: I1217 09:40:14.132954 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:40:14 crc kubenswrapper[4935]: I1217 09:40:14.689979 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlgm6" event={"ID":"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b","Type":"ContainerDied","Data":"52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1"} Dec 17 09:40:14 crc kubenswrapper[4935]: I1217 09:40:14.689782 4935 generic.go:334] "Generic (PLEG): container finished" podID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerID="52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1" exitCode=0 Dec 17 09:40:14 crc kubenswrapper[4935]: I1217 09:40:14.696961 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6"] Dec 17 09:40:15 crc kubenswrapper[4935]: I1217 09:40:15.703374 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlgm6" event={"ID":"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b","Type":"ContainerStarted","Data":"35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6"} Dec 17 09:40:15 crc kubenswrapper[4935]: I1217 09:40:15.704631 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" event={"ID":"4454b07b-03d5-46e3-8277-232e491c91c1","Type":"ContainerStarted","Data":"975870d47b4aaa9cd35de5698614235d66ed820c2b1676ea874558f5dafd8d68"} Dec 17 09:40:15 crc kubenswrapper[4935]: I1217 09:40:15.704658 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" event={"ID":"4454b07b-03d5-46e3-8277-232e491c91c1","Type":"ContainerStarted","Data":"97584123445c12df503e2b7e861c743939f9f6ebbe92a67102eff0aef898e89f"} Dec 17 09:40:15 crc 
kubenswrapper[4935]: I1217 09:40:15.731224 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rlgm6" podStartSLOduration=2.255077236 podStartE2EDuration="4.73119534s" podCreationTimestamp="2025-12-17 09:40:11 +0000 UTC" firstStartedPulling="2025-12-17 09:40:12.662408923 +0000 UTC m=+2132.322249696" lastFinishedPulling="2025-12-17 09:40:15.138527037 +0000 UTC m=+2134.798367800" observedRunningTime="2025-12-17 09:40:15.722670161 +0000 UTC m=+2135.382510944" watchObservedRunningTime="2025-12-17 09:40:15.73119534 +0000 UTC m=+2135.391036103" Dec 17 09:40:15 crc kubenswrapper[4935]: I1217 09:40:15.752070 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" podStartSLOduration=2.056976986 podStartE2EDuration="2.752038629s" podCreationTimestamp="2025-12-17 09:40:13 +0000 UTC" firstStartedPulling="2025-12-17 09:40:14.700029533 +0000 UTC m=+2134.359870296" lastFinishedPulling="2025-12-17 09:40:15.395091166 +0000 UTC m=+2135.054931939" observedRunningTime="2025-12-17 09:40:15.746852053 +0000 UTC m=+2135.406692816" watchObservedRunningTime="2025-12-17 09:40:15.752038629 +0000 UTC m=+2135.411879412" Dec 17 09:40:21 crc kubenswrapper[4935]: I1217 09:40:21.386395 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:21 crc kubenswrapper[4935]: I1217 09:40:21.387100 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:21 crc kubenswrapper[4935]: I1217 09:40:21.431239 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:21 crc kubenswrapper[4935]: I1217 09:40:21.813707 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:21 crc kubenswrapper[4935]: I1217 09:40:21.877039 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rlgm6"] Dec 17 09:40:23 crc kubenswrapper[4935]: I1217 09:40:23.783953 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rlgm6" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="registry-server" containerID="cri-o://35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6" gracePeriod=2 Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.260672 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.428961 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-utilities\") pod \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.429431 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-catalog-content\") pod \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.429568 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d5sv\" (UniqueName: \"kubernetes.io/projected/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-kube-api-access-4d5sv\") pod \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\" (UID: \"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b\") " Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.430553 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-utilities" (OuterVolumeSpecName: "utilities") pod "d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" (UID: "d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.435536 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-kube-api-access-4d5sv" (OuterVolumeSpecName: "kube-api-access-4d5sv") pod "d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" (UID: "d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b"). InnerVolumeSpecName "kube-api-access-4d5sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.492011 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" (UID: "d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.532365 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.532400 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.532410 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d5sv\" (UniqueName: \"kubernetes.io/projected/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b-kube-api-access-4d5sv\") on node \"crc\" DevicePath \"\"" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.823263 4935 generic.go:334] "Generic (PLEG): container finished" podID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerID="35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6" exitCode=0 Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.823302 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlgm6" event={"ID":"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b","Type":"ContainerDied","Data":"35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6"} Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.823358 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rlgm6" event={"ID":"d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b","Type":"ContainerDied","Data":"245b4c1f035f3a39cb19dabcc008c6bc02393879fea30193e8c62cad387c5737"} Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.823362 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rlgm6" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.823379 4935 scope.go:117] "RemoveContainer" containerID="35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.863125 4935 scope.go:117] "RemoveContainer" containerID="52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.880122 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rlgm6"] Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.891045 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rlgm6"] Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.903637 4935 scope.go:117] "RemoveContainer" containerID="0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.940609 4935 scope.go:117] "RemoveContainer" containerID="35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6" Dec 17 09:40:24 crc kubenswrapper[4935]: E1217 09:40:24.941046 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6\": container with ID starting with 35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6 not found: ID does not exist" containerID="35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.941104 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6"} err="failed to get container status \"35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6\": rpc error: code = NotFound desc = could not find 
container \"35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6\": container with ID starting with 35e99f44ab9a29433154cfce71100a06bb2b5222ab9e2fa5885f9aacee9f11f6 not found: ID does not exist" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.941139 4935 scope.go:117] "RemoveContainer" containerID="52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1" Dec 17 09:40:24 crc kubenswrapper[4935]: E1217 09:40:24.941439 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1\": container with ID starting with 52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1 not found: ID does not exist" containerID="52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.941472 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1"} err="failed to get container status \"52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1\": rpc error: code = NotFound desc = could not find container \"52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1\": container with ID starting with 52ec9c7540392e200de5b32b132ce2699ff3d49071df786cd6ec457030f435a1 not found: ID does not exist" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.941492 4935 scope.go:117] "RemoveContainer" containerID="0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309" Dec 17 09:40:24 crc kubenswrapper[4935]: E1217 09:40:24.941719 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309\": container with ID starting with 0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309 not found: ID does 
not exist" containerID="0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309" Dec 17 09:40:24 crc kubenswrapper[4935]: I1217 09:40:24.941747 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309"} err="failed to get container status \"0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309\": rpc error: code = NotFound desc = could not find container \"0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309\": container with ID starting with 0d18e0c27f67d9191a012a44c48027a73ddc1990d7f5a52d3f01b1b2d511b309 not found: ID does not exist" Dec 17 09:40:25 crc kubenswrapper[4935]: I1217 09:40:25.139405 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" path="/var/lib/kubelet/pods/d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b/volumes" Dec 17 09:41:07 crc kubenswrapper[4935]: I1217 09:41:07.255143 4935 generic.go:334] "Generic (PLEG): container finished" podID="4454b07b-03d5-46e3-8277-232e491c91c1" containerID="975870d47b4aaa9cd35de5698614235d66ed820c2b1676ea874558f5dafd8d68" exitCode=0 Dec 17 09:41:07 crc kubenswrapper[4935]: I1217 09:41:07.255285 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" event={"ID":"4454b07b-03d5-46e3-8277-232e491c91c1","Type":"ContainerDied","Data":"975870d47b4aaa9cd35de5698614235d66ed820c2b1676ea874558f5dafd8d68"} Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.730089 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.792992 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-nova-metadata-neutron-config-0\") pod \"4454b07b-03d5-46e3-8277-232e491c91c1\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.793162 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-metadata-combined-ca-bundle\") pod \"4454b07b-03d5-46e3-8277-232e491c91c1\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.793202 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-inventory\") pod \"4454b07b-03d5-46e3-8277-232e491c91c1\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.793252 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4454b07b-03d5-46e3-8277-232e491c91c1\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.793293 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-ssh-key\") pod \"4454b07b-03d5-46e3-8277-232e491c91c1\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " Dec 17 09:41:08 crc 
kubenswrapper[4935]: I1217 09:41:08.793426 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8zhf\" (UniqueName: \"kubernetes.io/projected/4454b07b-03d5-46e3-8277-232e491c91c1-kube-api-access-r8zhf\") pod \"4454b07b-03d5-46e3-8277-232e491c91c1\" (UID: \"4454b07b-03d5-46e3-8277-232e491c91c1\") " Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.799749 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4454b07b-03d5-46e3-8277-232e491c91c1" (UID: "4454b07b-03d5-46e3-8277-232e491c91c1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.800590 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4454b07b-03d5-46e3-8277-232e491c91c1-kube-api-access-r8zhf" (OuterVolumeSpecName: "kube-api-access-r8zhf") pod "4454b07b-03d5-46e3-8277-232e491c91c1" (UID: "4454b07b-03d5-46e3-8277-232e491c91c1"). InnerVolumeSpecName "kube-api-access-r8zhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.829499 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4454b07b-03d5-46e3-8277-232e491c91c1" (UID: "4454b07b-03d5-46e3-8277-232e491c91c1"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.830012 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4454b07b-03d5-46e3-8277-232e491c91c1" (UID: "4454b07b-03d5-46e3-8277-232e491c91c1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.831109 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4454b07b-03d5-46e3-8277-232e491c91c1" (UID: "4454b07b-03d5-46e3-8277-232e491c91c1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.832085 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-inventory" (OuterVolumeSpecName: "inventory") pod "4454b07b-03d5-46e3-8277-232e491c91c1" (UID: "4454b07b-03d5-46e3-8277-232e491c91c1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.897287 4935 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.897435 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.897498 4935 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.897568 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.897627 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8zhf\" (UniqueName: \"kubernetes.io/projected/4454b07b-03d5-46e3-8277-232e491c91c1-kube-api-access-r8zhf\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:08 crc kubenswrapper[4935]: I1217 09:41:08.897688 4935 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4454b07b-03d5-46e3-8277-232e491c91c1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.279476 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" 
event={"ID":"4454b07b-03d5-46e3-8277-232e491c91c1","Type":"ContainerDied","Data":"97584123445c12df503e2b7e861c743939f9f6ebbe92a67102eff0aef898e89f"} Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.279569 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97584123445c12df503e2b7e861c743939f9f6ebbe92a67102eff0aef898e89f" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.279565 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.404725 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv"] Dec 17 09:41:09 crc kubenswrapper[4935]: E1217 09:41:09.405530 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="extract-content" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.405560 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="extract-content" Dec 17 09:41:09 crc kubenswrapper[4935]: E1217 09:41:09.405605 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="extract-utilities" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.405615 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="extract-utilities" Dec 17 09:41:09 crc kubenswrapper[4935]: E1217 09:41:09.405634 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4454b07b-03d5-46e3-8277-232e491c91c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.405646 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="4454b07b-03d5-46e3-8277-232e491c91c1" 
containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 17 09:41:09 crc kubenswrapper[4935]: E1217 09:41:09.405661 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="registry-server" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.405667 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="registry-server" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.405861 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="d60d603a-9f2f-4ea7-8c53-1c3c6fa4698b" containerName="registry-server" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.405883 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="4454b07b-03d5-46e3-8277-232e491c91c1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.406629 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.409765 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.410126 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.410312 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.411869 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.414195 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.421324 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv"] Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.510653 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.510750 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rk4q\" (UniqueName: \"kubernetes.io/projected/61772212-3ef5-4d2a-91be-96cd39dbb4e3-kube-api-access-6rk4q\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: 
\"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.510797 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.510822 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.510973 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.612520 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.612625 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6rk4q\" (UniqueName: \"kubernetes.io/projected/61772212-3ef5-4d2a-91be-96cd39dbb4e3-kube-api-access-6rk4q\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.612685 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.612727 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.612923 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.618631 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.619088 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.619690 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.620045 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.632384 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rk4q\" (UniqueName: \"kubernetes.io/projected/61772212-3ef5-4d2a-91be-96cd39dbb4e3-kube-api-access-6rk4q\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-znsdv\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.735159 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.761050 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6vwv6"] Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.763352 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.800866 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vwv6"] Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.816809 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-catalog-content\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.816918 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-utilities\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.816978 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfv7\" (UniqueName: \"kubernetes.io/projected/612ab21c-d6af-42d3-a8e7-871481d17e4f-kube-api-access-vwfv7\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.919454 4935 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-catalog-content\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.920040 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-utilities\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.920131 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-catalog-content\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.920247 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwfv7\" (UniqueName: \"kubernetes.io/projected/612ab21c-d6af-42d3-a8e7-871481d17e4f-kube-api-access-vwfv7\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.920476 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-utilities\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:09 crc kubenswrapper[4935]: I1217 09:41:09.944846 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfv7\" 
(UniqueName: \"kubernetes.io/projected/612ab21c-d6af-42d3-a8e7-871481d17e4f-kube-api-access-vwfv7\") pod \"certified-operators-6vwv6\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:10 crc kubenswrapper[4935]: I1217 09:41:10.189445 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:10 crc kubenswrapper[4935]: I1217 09:41:10.443549 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv"] Dec 17 09:41:10 crc kubenswrapper[4935]: I1217 09:41:10.554679 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6vwv6"] Dec 17 09:41:11 crc kubenswrapper[4935]: I1217 09:41:11.319260 4935 generic.go:334] "Generic (PLEG): container finished" podID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerID="21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae" exitCode=0 Dec 17 09:41:11 crc kubenswrapper[4935]: I1217 09:41:11.320035 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vwv6" event={"ID":"612ab21c-d6af-42d3-a8e7-871481d17e4f","Type":"ContainerDied","Data":"21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae"} Dec 17 09:41:11 crc kubenswrapper[4935]: I1217 09:41:11.320067 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vwv6" event={"ID":"612ab21c-d6af-42d3-a8e7-871481d17e4f","Type":"ContainerStarted","Data":"984291b628e68e3ec4518cc596a9560e1b8be3d7d58b7a47874e17a8c584a6da"} Dec 17 09:41:11 crc kubenswrapper[4935]: I1217 09:41:11.321705 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" 
event={"ID":"61772212-3ef5-4d2a-91be-96cd39dbb4e3","Type":"ContainerStarted","Data":"236b7e26af47f20088251afba13422f48ea3b7116296c0b8f8b0bb2240739362"} Dec 17 09:41:12 crc kubenswrapper[4935]: I1217 09:41:12.333641 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" event={"ID":"61772212-3ef5-4d2a-91be-96cd39dbb4e3","Type":"ContainerStarted","Data":"9ff61d330dd44a3deaadbfab3cafaa1a9ee1e869d963408326cfe837ac885970"} Dec 17 09:41:12 crc kubenswrapper[4935]: I1217 09:41:12.357957 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" podStartSLOduration=2.644271363 podStartE2EDuration="3.357936791s" podCreationTimestamp="2025-12-17 09:41:09 +0000 UTC" firstStartedPulling="2025-12-17 09:41:10.478302131 +0000 UTC m=+2190.138142894" lastFinishedPulling="2025-12-17 09:41:11.191967569 +0000 UTC m=+2190.851808322" observedRunningTime="2025-12-17 09:41:12.349924415 +0000 UTC m=+2192.009765178" watchObservedRunningTime="2025-12-17 09:41:12.357936791 +0000 UTC m=+2192.017777554" Dec 17 09:41:13 crc kubenswrapper[4935]: I1217 09:41:13.350077 4935 generic.go:334] "Generic (PLEG): container finished" podID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerID="45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae" exitCode=0 Dec 17 09:41:13 crc kubenswrapper[4935]: I1217 09:41:13.350256 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vwv6" event={"ID":"612ab21c-d6af-42d3-a8e7-871481d17e4f","Type":"ContainerDied","Data":"45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae"} Dec 17 09:41:15 crc kubenswrapper[4935]: I1217 09:41:15.369753 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vwv6" 
event={"ID":"612ab21c-d6af-42d3-a8e7-871481d17e4f","Type":"ContainerStarted","Data":"655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c"} Dec 17 09:41:15 crc kubenswrapper[4935]: I1217 09:41:15.398058 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6vwv6" podStartSLOduration=2.800795767 podStartE2EDuration="6.398041597s" podCreationTimestamp="2025-12-17 09:41:09 +0000 UTC" firstStartedPulling="2025-12-17 09:41:11.321744931 +0000 UTC m=+2190.981585704" lastFinishedPulling="2025-12-17 09:41:14.918990771 +0000 UTC m=+2194.578831534" observedRunningTime="2025-12-17 09:41:15.394159993 +0000 UTC m=+2195.054000756" watchObservedRunningTime="2025-12-17 09:41:15.398041597 +0000 UTC m=+2195.057882360" Dec 17 09:41:20 crc kubenswrapper[4935]: I1217 09:41:20.190701 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:20 crc kubenswrapper[4935]: I1217 09:41:20.191326 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:20 crc kubenswrapper[4935]: I1217 09:41:20.234990 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:20 crc kubenswrapper[4935]: I1217 09:41:20.473348 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:20 crc kubenswrapper[4935]: I1217 09:41:20.523902 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vwv6"] Dec 17 09:41:22 crc kubenswrapper[4935]: I1217 09:41:22.432894 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6vwv6" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="registry-server" 
containerID="cri-o://655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c" gracePeriod=2 Dec 17 09:41:22 crc kubenswrapper[4935]: I1217 09:41:22.916798 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.099566 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwfv7\" (UniqueName: \"kubernetes.io/projected/612ab21c-d6af-42d3-a8e7-871481d17e4f-kube-api-access-vwfv7\") pod \"612ab21c-d6af-42d3-a8e7-871481d17e4f\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.099666 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-catalog-content\") pod \"612ab21c-d6af-42d3-a8e7-871481d17e4f\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.099823 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-utilities\") pod \"612ab21c-d6af-42d3-a8e7-871481d17e4f\" (UID: \"612ab21c-d6af-42d3-a8e7-871481d17e4f\") " Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.100894 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-utilities" (OuterVolumeSpecName: "utilities") pod "612ab21c-d6af-42d3-a8e7-871481d17e4f" (UID: "612ab21c-d6af-42d3-a8e7-871481d17e4f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.106123 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/612ab21c-d6af-42d3-a8e7-871481d17e4f-kube-api-access-vwfv7" (OuterVolumeSpecName: "kube-api-access-vwfv7") pod "612ab21c-d6af-42d3-a8e7-871481d17e4f" (UID: "612ab21c-d6af-42d3-a8e7-871481d17e4f"). InnerVolumeSpecName "kube-api-access-vwfv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.202794 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwfv7\" (UniqueName: \"kubernetes.io/projected/612ab21c-d6af-42d3-a8e7-871481d17e4f-kube-api-access-vwfv7\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.202834 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.457425 4935 generic.go:334] "Generic (PLEG): container finished" podID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerID="655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c" exitCode=0 Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.457820 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6vwv6" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.457846 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vwv6" event={"ID":"612ab21c-d6af-42d3-a8e7-871481d17e4f","Type":"ContainerDied","Data":"655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c"} Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.457890 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6vwv6" event={"ID":"612ab21c-d6af-42d3-a8e7-871481d17e4f","Type":"ContainerDied","Data":"984291b628e68e3ec4518cc596a9560e1b8be3d7d58b7a47874e17a8c584a6da"} Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.457917 4935 scope.go:117] "RemoveContainer" containerID="655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.480370 4935 scope.go:117] "RemoveContainer" containerID="45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.501920 4935 scope.go:117] "RemoveContainer" containerID="21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.537904 4935 scope.go:117] "RemoveContainer" containerID="655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c" Dec 17 09:41:23 crc kubenswrapper[4935]: E1217 09:41:23.538387 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c\": container with ID starting with 655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c not found: ID does not exist" containerID="655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.538464 4935 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c"} err="failed to get container status \"655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c\": rpc error: code = NotFound desc = could not find container \"655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c\": container with ID starting with 655274cfcc5aab842ea4a027c0784da3d660f0cd652ca41cf8b69d4f6fa4541c not found: ID does not exist" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.538493 4935 scope.go:117] "RemoveContainer" containerID="45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae" Dec 17 09:41:23 crc kubenswrapper[4935]: E1217 09:41:23.538769 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae\": container with ID starting with 45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae not found: ID does not exist" containerID="45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.538805 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae"} err="failed to get container status \"45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae\": rpc error: code = NotFound desc = could not find container \"45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae\": container with ID starting with 45779adb00854deb9632c66df7f372b84838fb9047240a1b7cf6c5e5bd6473ae not found: ID does not exist" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.538830 4935 scope.go:117] "RemoveContainer" containerID="21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae" Dec 17 09:41:23 crc kubenswrapper[4935]: E1217 09:41:23.539003 4935 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae\": container with ID starting with 21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae not found: ID does not exist" containerID="21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.539026 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae"} err="failed to get container status \"21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae\": rpc error: code = NotFound desc = could not find container \"21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae\": container with ID starting with 21157720fffe97f617dd70bc25d0d076cadc8144956c76c5f2b3d683572269ae not found: ID does not exist" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.854148 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "612ab21c-d6af-42d3-a8e7-871481d17e4f" (UID: "612ab21c-d6af-42d3-a8e7-871481d17e4f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:41:23 crc kubenswrapper[4935]: I1217 09:41:23.919473 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/612ab21c-d6af-42d3-a8e7-871481d17e4f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:41:24 crc kubenswrapper[4935]: I1217 09:41:24.090563 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6vwv6"] Dec 17 09:41:24 crc kubenswrapper[4935]: I1217 09:41:24.099593 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6vwv6"] Dec 17 09:41:25 crc kubenswrapper[4935]: I1217 09:41:25.136132 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" path="/var/lib/kubelet/pods/612ab21c-d6af-42d3-a8e7-871481d17e4f/volumes" Dec 17 09:41:30 crc kubenswrapper[4935]: I1217 09:41:30.131221 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:41:30 crc kubenswrapper[4935]: I1217 09:41:30.131859 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:42:00 crc kubenswrapper[4935]: I1217 09:42:00.130470 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 17 09:42:00 crc kubenswrapper[4935]: I1217 09:42:00.131233 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.399370 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kg62l"] Dec 17 09:42:29 crc kubenswrapper[4935]: E1217 09:42:29.400862 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="extract-content" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.400883 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="extract-content" Dec 17 09:42:29 crc kubenswrapper[4935]: E1217 09:42:29.400925 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="registry-server" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.400932 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="registry-server" Dec 17 09:42:29 crc kubenswrapper[4935]: E1217 09:42:29.400954 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="extract-utilities" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.400962 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="extract-utilities" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.401180 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="612ab21c-d6af-42d3-a8e7-871481d17e4f" containerName="registry-server" Dec 17 09:42:29 
crc kubenswrapper[4935]: I1217 09:42:29.405942 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.425651 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg62l"] Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.551917 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfbt6\" (UniqueName: \"kubernetes.io/projected/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-kube-api-access-cfbt6\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.552227 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-utilities\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.552389 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-catalog-content\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.654958 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfbt6\" (UniqueName: \"kubernetes.io/projected/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-kube-api-access-cfbt6\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" 
Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.655019 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-utilities\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.655067 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-catalog-content\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.655595 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-utilities\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.655626 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-catalog-content\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 09:42:29.694489 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfbt6\" (UniqueName: \"kubernetes.io/projected/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-kube-api-access-cfbt6\") pod \"redhat-marketplace-kg62l\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:29 crc kubenswrapper[4935]: I1217 
09:42:29.734316 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:30 crc kubenswrapper[4935]: I1217 09:42:30.130947 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:42:30 crc kubenswrapper[4935]: I1217 09:42:30.131224 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:42:30 crc kubenswrapper[4935]: I1217 09:42:30.131292 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:42:30 crc kubenswrapper[4935]: I1217 09:42:30.131839 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:42:30 crc kubenswrapper[4935]: I1217 09:42:30.131892 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" gracePeriod=600 Dec 17 09:42:30 crc kubenswrapper[4935]: I1217 09:42:30.218719 4935 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg62l"] Dec 17 09:42:30 crc kubenswrapper[4935]: E1217 09:42:30.253617 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:42:31 crc kubenswrapper[4935]: I1217 09:42:31.052206 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" exitCode=0 Dec 17 09:42:31 crc kubenswrapper[4935]: I1217 09:42:31.052313 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"} Dec 17 09:42:31 crc kubenswrapper[4935]: I1217 09:42:31.052713 4935 scope.go:117] "RemoveContainer" containerID="a83a24692330d7443f7f4ba2368305dece4f6efd7ec6cd4d7ebd5ce65b90336c" Dec 17 09:42:31 crc kubenswrapper[4935]: I1217 09:42:31.053733 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:42:31 crc kubenswrapper[4935]: E1217 09:42:31.054293 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" 
podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:42:31 crc kubenswrapper[4935]: I1217 09:42:31.054672 4935 generic.go:334] "Generic (PLEG): container finished" podID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerID="040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c" exitCode=0 Dec 17 09:42:31 crc kubenswrapper[4935]: I1217 09:42:31.054754 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg62l" event={"ID":"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6","Type":"ContainerDied","Data":"040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c"} Dec 17 09:42:31 crc kubenswrapper[4935]: I1217 09:42:31.054797 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg62l" event={"ID":"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6","Type":"ContainerStarted","Data":"dc68ac1623bb10f70e8ea5c6675508d560eeb18ad2fbcc41aa6af08bb84d5ace"} Dec 17 09:42:33 crc kubenswrapper[4935]: I1217 09:42:33.077676 4935 generic.go:334] "Generic (PLEG): container finished" podID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerID="b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7" exitCode=0 Dec 17 09:42:33 crc kubenswrapper[4935]: I1217 09:42:33.077748 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg62l" event={"ID":"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6","Type":"ContainerDied","Data":"b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7"} Dec 17 09:42:34 crc kubenswrapper[4935]: I1217 09:42:34.088746 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg62l" event={"ID":"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6","Type":"ContainerStarted","Data":"292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618"} Dec 17 09:42:34 crc kubenswrapper[4935]: I1217 09:42:34.116579 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-kg62l" podStartSLOduration=2.362171667 podStartE2EDuration="5.116558152s" podCreationTimestamp="2025-12-17 09:42:29 +0000 UTC" firstStartedPulling="2025-12-17 09:42:31.056952969 +0000 UTC m=+2270.716793722" lastFinishedPulling="2025-12-17 09:42:33.811339434 +0000 UTC m=+2273.471180207" observedRunningTime="2025-12-17 09:42:34.114888961 +0000 UTC m=+2273.774729724" watchObservedRunningTime="2025-12-17 09:42:34.116558152 +0000 UTC m=+2273.776398915" Dec 17 09:42:39 crc kubenswrapper[4935]: I1217 09:42:39.734530 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:39 crc kubenswrapper[4935]: I1217 09:42:39.735542 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:39 crc kubenswrapper[4935]: I1217 09:42:39.796046 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:40 crc kubenswrapper[4935]: I1217 09:42:40.181661 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:40 crc kubenswrapper[4935]: I1217 09:42:40.241974 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg62l"] Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.162649 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kg62l" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="registry-server" containerID="cri-o://292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618" gracePeriod=2 Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.629316 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg62l" Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.770792 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-catalog-content\") pod \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.770867 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbt6\" (UniqueName: \"kubernetes.io/projected/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-kube-api-access-cfbt6\") pod \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.770921 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-utilities\") pod \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\" (UID: \"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6\") " Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.772574 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-utilities" (OuterVolumeSpecName: "utilities") pod "cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" (UID: "cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.782231 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-kube-api-access-cfbt6" (OuterVolumeSpecName: "kube-api-access-cfbt6") pod "cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" (UID: "cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6"). InnerVolumeSpecName "kube-api-access-cfbt6". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.800521 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" (UID: "cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.874180 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-utilities\") on node \"crc\" DevicePath \"\""
Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.874225 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 17 09:42:42 crc kubenswrapper[4935]: I1217 09:42:42.874241 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbt6\" (UniqueName: \"kubernetes.io/projected/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6-kube-api-access-cfbt6\") on node \"crc\" DevicePath \"\""
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.124386 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:42:43 crc kubenswrapper[4935]: E1217 09:42:43.124932 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.175231 4935 generic.go:334] "Generic (PLEG): container finished" podID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerID="292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618" exitCode=0
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.175306 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg62l" event={"ID":"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6","Type":"ContainerDied","Data":"292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618"}
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.175348 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kg62l" event={"ID":"cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6","Type":"ContainerDied","Data":"dc68ac1623bb10f70e8ea5c6675508d560eeb18ad2fbcc41aa6af08bb84d5ace"}
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.175367 4935 scope.go:117] "RemoveContainer" containerID="292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.175316 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kg62l"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.206917 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg62l"]
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.208311 4935 scope.go:117] "RemoveContainer" containerID="b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.216578 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kg62l"]
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.234668 4935 scope.go:117] "RemoveContainer" containerID="040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.289434 4935 scope.go:117] "RemoveContainer" containerID="292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618"
Dec 17 09:42:43 crc kubenswrapper[4935]: E1217 09:42:43.290190 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618\": container with ID starting with 292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618 not found: ID does not exist" containerID="292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.290259 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618"} err="failed to get container status \"292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618\": rpc error: code = NotFound desc = could not find container \"292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618\": container with ID starting with 292f4a89b8d29ddf914d9e69f75de4263892e6f8de813bc1d2d53a2190641618 not found: ID does not exist"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.290327 4935 scope.go:117] "RemoveContainer" containerID="b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7"
Dec 17 09:42:43 crc kubenswrapper[4935]: E1217 09:42:43.291186 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7\": container with ID starting with b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7 not found: ID does not exist" containerID="b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.291214 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7"} err="failed to get container status \"b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7\": rpc error: code = NotFound desc = could not find container \"b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7\": container with ID starting with b39449b9fbb6aa289b0c1e856ff3eb6501f52f29ee1edf8b619ae9caa4f18bc7 not found: ID does not exist"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.291232 4935 scope.go:117] "RemoveContainer" containerID="040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c"
Dec 17 09:42:43 crc kubenswrapper[4935]: E1217 09:42:43.291933 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c\": container with ID starting with 040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c not found: ID does not exist" containerID="040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c"
Dec 17 09:42:43 crc kubenswrapper[4935]: I1217 09:42:43.291981 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c"} err="failed to get container status \"040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c\": rpc error: code = NotFound desc = could not find container \"040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c\": container with ID starting with 040b8eea6e64a786fac1a99fcdeed60e11cec4fb7e88f6ad722250cc11da9c8c not found: ID does not exist"
Dec 17 09:42:45 crc kubenswrapper[4935]: I1217 09:42:45.134292 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" path="/var/lib/kubelet/pods/cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6/volumes"
Dec 17 09:42:55 crc kubenswrapper[4935]: I1217 09:42:55.124457 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:42:55 crc kubenswrapper[4935]: E1217 09:42:55.125715 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:43:09 crc kubenswrapper[4935]: I1217 09:43:09.124729 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:43:09 crc kubenswrapper[4935]: E1217 09:43:09.125738 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:43:22 crc kubenswrapper[4935]: I1217 09:43:22.125507 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:43:22 crc kubenswrapper[4935]: E1217 09:43:22.126401 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:43:35 crc kubenswrapper[4935]: I1217 09:43:35.124556 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:43:35 crc kubenswrapper[4935]: E1217 09:43:35.125369 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:43:50 crc kubenswrapper[4935]: I1217 09:43:50.124923 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:43:50 crc kubenswrapper[4935]: E1217 09:43:50.125864 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:44:04 crc kubenswrapper[4935]: I1217 09:44:04.124325 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:44:04 crc kubenswrapper[4935]: E1217 09:44:04.125140 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:44:19 crc kubenswrapper[4935]: I1217 09:44:19.125356 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:44:19 crc kubenswrapper[4935]: E1217 09:44:19.126781 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:44:32 crc kubenswrapper[4935]: I1217 09:44:32.124295 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:44:32 crc kubenswrapper[4935]: E1217 09:44:32.124957 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:44:44 crc kubenswrapper[4935]: I1217 09:44:44.125488 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:44:44 crc kubenswrapper[4935]: E1217 09:44:44.126453 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:44:59 crc kubenswrapper[4935]: I1217 09:44:59.124548 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:44:59 crc kubenswrapper[4935]: E1217 09:44:59.125404 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.144207 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"]
Dec 17 09:45:00 crc kubenswrapper[4935]: E1217 09:45:00.144885 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="extract-utilities"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.144896 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="extract-utilities"
Dec 17 09:45:00 crc kubenswrapper[4935]: E1217 09:45:00.144910 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="registry-server"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.144917 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="registry-server"
Dec 17 09:45:00 crc kubenswrapper[4935]: E1217 09:45:00.144928 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="extract-content"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.144934 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="extract-content"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.145134 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfe2cfda-0259-472f-8b92-3b5ce7bdf8a6" containerName="registry-server"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.145779 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.147615 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.147699 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.153299 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"]
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.280566 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5948033f-11ea-42cc-86dd-ea9119c06245-config-volume\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.280761 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5948033f-11ea-42cc-86dd-ea9119c06245-secret-volume\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.280814 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4l6r\" (UniqueName: \"kubernetes.io/projected/5948033f-11ea-42cc-86dd-ea9119c06245-kube-api-access-n4l6r\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.383405 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5948033f-11ea-42cc-86dd-ea9119c06245-config-volume\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.383990 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5948033f-11ea-42cc-86dd-ea9119c06245-secret-volume\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.384121 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4l6r\" (UniqueName: \"kubernetes.io/projected/5948033f-11ea-42cc-86dd-ea9119c06245-kube-api-access-n4l6r\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.384434 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5948033f-11ea-42cc-86dd-ea9119c06245-config-volume\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.391044 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5948033f-11ea-42cc-86dd-ea9119c06245-secret-volume\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.401010 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4l6r\" (UniqueName: \"kubernetes.io/projected/5948033f-11ea-42cc-86dd-ea9119c06245-kube-api-access-n4l6r\") pod \"collect-profiles-29432745-wddqq\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.464644 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:00 crc kubenswrapper[4935]: I1217 09:45:00.887860 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"]
Dec 17 09:45:01 crc kubenswrapper[4935]: I1217 09:45:01.444853 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq" event={"ID":"5948033f-11ea-42cc-86dd-ea9119c06245","Type":"ContainerStarted","Data":"1f4129d6c294121c3deb4d7d069cae85e7f71053423cc1744be1789b8a69579c"}
Dec 17 09:45:02 crc kubenswrapper[4935]: I1217 09:45:02.458606 4935 generic.go:334] "Generic (PLEG): container finished" podID="5948033f-11ea-42cc-86dd-ea9119c06245" containerID="46b5decf22388ed4ec7c01c91cdb9ae7628e986102ad61186ae94348e2ba0475" exitCode=0
Dec 17 09:45:02 crc kubenswrapper[4935]: I1217 09:45:02.458816 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq" event={"ID":"5948033f-11ea-42cc-86dd-ea9119c06245","Type":"ContainerDied","Data":"46b5decf22388ed4ec7c01c91cdb9ae7628e986102ad61186ae94348e2ba0475"}
Dec 17 09:45:03 crc kubenswrapper[4935]: I1217 09:45:03.851127 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:03 crc kubenswrapper[4935]: I1217 09:45:03.948173 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4l6r\" (UniqueName: \"kubernetes.io/projected/5948033f-11ea-42cc-86dd-ea9119c06245-kube-api-access-n4l6r\") pod \"5948033f-11ea-42cc-86dd-ea9119c06245\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") "
Dec 17 09:45:03 crc kubenswrapper[4935]: I1217 09:45:03.948258 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5948033f-11ea-42cc-86dd-ea9119c06245-secret-volume\") pod \"5948033f-11ea-42cc-86dd-ea9119c06245\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") "
Dec 17 09:45:03 crc kubenswrapper[4935]: I1217 09:45:03.948495 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5948033f-11ea-42cc-86dd-ea9119c06245-config-volume\") pod \"5948033f-11ea-42cc-86dd-ea9119c06245\" (UID: \"5948033f-11ea-42cc-86dd-ea9119c06245\") "
Dec 17 09:45:03 crc kubenswrapper[4935]: I1217 09:45:03.949440 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5948033f-11ea-42cc-86dd-ea9119c06245-config-volume" (OuterVolumeSpecName: "config-volume") pod "5948033f-11ea-42cc-86dd-ea9119c06245" (UID: "5948033f-11ea-42cc-86dd-ea9119c06245"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 17 09:45:03 crc kubenswrapper[4935]: I1217 09:45:03.954221 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5948033f-11ea-42cc-86dd-ea9119c06245-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5948033f-11ea-42cc-86dd-ea9119c06245" (UID: "5948033f-11ea-42cc-86dd-ea9119c06245"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:45:03 crc kubenswrapper[4935]: I1217 09:45:03.955030 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5948033f-11ea-42cc-86dd-ea9119c06245-kube-api-access-n4l6r" (OuterVolumeSpecName: "kube-api-access-n4l6r") pod "5948033f-11ea-42cc-86dd-ea9119c06245" (UID: "5948033f-11ea-42cc-86dd-ea9119c06245"). InnerVolumeSpecName "kube-api-access-n4l6r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.051375 4935 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5948033f-11ea-42cc-86dd-ea9119c06245-config-volume\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.051427 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4l6r\" (UniqueName: \"kubernetes.io/projected/5948033f-11ea-42cc-86dd-ea9119c06245-kube-api-access-n4l6r\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.051443 4935 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5948033f-11ea-42cc-86dd-ea9119c06245-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.489432 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq" event={"ID":"5948033f-11ea-42cc-86dd-ea9119c06245","Type":"ContainerDied","Data":"1f4129d6c294121c3deb4d7d069cae85e7f71053423cc1744be1789b8a69579c"}
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.489764 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4129d6c294121c3deb4d7d069cae85e7f71053423cc1744be1789b8a69579c"
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.489476 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432745-wddqq"
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.919685 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc"]
Dec 17 09:45:04 crc kubenswrapper[4935]: I1217 09:45:04.927099 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432700-jt6zc"]
Dec 17 09:45:05 crc kubenswrapper[4935]: I1217 09:45:05.137366 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524" path="/var/lib/kubelet/pods/78b8f890-21e9-4ff4-bbbb-dd5e2b2ef524/volumes"
Dec 17 09:45:14 crc kubenswrapper[4935]: I1217 09:45:14.124247 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:45:14 crc kubenswrapper[4935]: E1217 09:45:14.125092 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:45:28 crc kubenswrapper[4935]: I1217 09:45:28.125734 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c"
Dec 17 09:45:28 crc kubenswrapper[4935]: E1217 09:45:28.127140 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc"
Dec 17 09:45:29 crc kubenswrapper[4935]: I1217 09:45:29.729754 4935 generic.go:334] "Generic (PLEG): container finished" podID="61772212-3ef5-4d2a-91be-96cd39dbb4e3" containerID="9ff61d330dd44a3deaadbfab3cafaa1a9ee1e869d963408326cfe837ac885970" exitCode=0
Dec 17 09:45:29 crc kubenswrapper[4935]: I1217 09:45:29.729831 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" event={"ID":"61772212-3ef5-4d2a-91be-96cd39dbb4e3","Type":"ContainerDied","Data":"9ff61d330dd44a3deaadbfab3cafaa1a9ee1e869d963408326cfe837ac885970"}
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.140928 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.250429 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-secret-0\") pod \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") "
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.250536 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-ssh-key\") pod \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") "
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.250608 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rk4q\" (UniqueName: \"kubernetes.io/projected/61772212-3ef5-4d2a-91be-96cd39dbb4e3-kube-api-access-6rk4q\") pod \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") "
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.250936 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-combined-ca-bundle\") pod \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") "
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.251012 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-inventory\") pod \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\" (UID: \"61772212-3ef5-4d2a-91be-96cd39dbb4e3\") "
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.259622 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "61772212-3ef5-4d2a-91be-96cd39dbb4e3" (UID: "61772212-3ef5-4d2a-91be-96cd39dbb4e3"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.263717 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61772212-3ef5-4d2a-91be-96cd39dbb4e3-kube-api-access-6rk4q" (OuterVolumeSpecName: "kube-api-access-6rk4q") pod "61772212-3ef5-4d2a-91be-96cd39dbb4e3" (UID: "61772212-3ef5-4d2a-91be-96cd39dbb4e3"). InnerVolumeSpecName "kube-api-access-6rk4q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.283606 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-inventory" (OuterVolumeSpecName: "inventory") pod "61772212-3ef5-4d2a-91be-96cd39dbb4e3" (UID: "61772212-3ef5-4d2a-91be-96cd39dbb4e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.286416 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "61772212-3ef5-4d2a-91be-96cd39dbb4e3" (UID: "61772212-3ef5-4d2a-91be-96cd39dbb4e3"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.295443 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "61772212-3ef5-4d2a-91be-96cd39dbb4e3" (UID: "61772212-3ef5-4d2a-91be-96cd39dbb4e3"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.358507 4935 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.358571 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-inventory\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.358588 4935 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.358601 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61772212-3ef5-4d2a-91be-96cd39dbb4e3-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.358615 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rk4q\" (UniqueName: \"kubernetes.io/projected/61772212-3ef5-4d2a-91be-96cd39dbb4e3-kube-api-access-6rk4q\") on node \"crc\" DevicePath \"\""
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.751380 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv" event={"ID":"61772212-3ef5-4d2a-91be-96cd39dbb4e3","Type":"ContainerDied","Data":"236b7e26af47f20088251afba13422f48ea3b7116296c0b8f8b0bb2240739362"}
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.751436 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="236b7e26af47f20088251afba13422f48ea3b7116296c0b8f8b0bb2240739362"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.751452 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-znsdv"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.886960 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz"]
Dec 17 09:45:31 crc kubenswrapper[4935]: E1217 09:45:31.887349 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5948033f-11ea-42cc-86dd-ea9119c06245" containerName="collect-profiles"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.887362 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5948033f-11ea-42cc-86dd-ea9119c06245" containerName="collect-profiles"
Dec 17 09:45:31 crc kubenswrapper[4935]: E1217 09:45:31.887390 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61772212-3ef5-4d2a-91be-96cd39dbb4e3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.887398 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="61772212-3ef5-4d2a-91be-96cd39dbb4e3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.887606 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5948033f-11ea-42cc-86dd-ea9119c06245" containerName="collect-profiles"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.887617 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="61772212-3ef5-4d2a-91be-96cd39dbb4e3" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.888266 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.891107 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.892541 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.892605 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.892707 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.892789 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.892834 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.893036 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.910351 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz"]
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.974994 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz"
Dec 17 09:45:31 crc kubenswrapper[4935]: I1217
09:45:31.975072 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.975485 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.975618 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.975772 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.976005 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.976134 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.976187 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:31 crc kubenswrapper[4935]: I1217 09:45:31.976313 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqkbp\" (UniqueName: \"kubernetes.io/projected/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-kube-api-access-vqkbp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078571 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078651 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078678 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078708 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqkbp\" (UniqueName: \"kubernetes.io/projected/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-kube-api-access-vqkbp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078795 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078826 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078885 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078916 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.078964 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.081142 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc 
kubenswrapper[4935]: I1217 09:45:32.084431 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.084958 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.085371 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.086249 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.086490 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: 
\"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.087892 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.089252 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.101778 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqkbp\" (UniqueName: \"kubernetes.io/projected/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-kube-api-access-vqkbp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-sp6wz\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.209380 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.751471 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz"] Dec 17 09:45:32 crc kubenswrapper[4935]: I1217 09:45:32.767105 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 09:45:33 crc kubenswrapper[4935]: I1217 09:45:33.767001 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" event={"ID":"e1f63f1c-eca8-4e26-ab88-07f61efb54bb","Type":"ContainerStarted","Data":"5f2381635ff20a0488749e7a4848db0ba2ba4ee65db2b4e1c87661f39befcca3"} Dec 17 09:45:34 crc kubenswrapper[4935]: I1217 09:45:34.778896 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" event={"ID":"e1f63f1c-eca8-4e26-ab88-07f61efb54bb","Type":"ContainerStarted","Data":"59e04aa627f9e0fa9249d89f16b2ac52267480c2d846e6ad482d4f2a255b9c62"} Dec 17 09:45:34 crc kubenswrapper[4935]: I1217 09:45:34.808710 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" podStartSLOduration=2.325998536 podStartE2EDuration="3.808675514s" podCreationTimestamp="2025-12-17 09:45:31 +0000 UTC" firstStartedPulling="2025-12-17 09:45:32.766799007 +0000 UTC m=+2452.426639770" lastFinishedPulling="2025-12-17 09:45:34.249475985 +0000 UTC m=+2453.909316748" observedRunningTime="2025-12-17 09:45:34.80523756 +0000 UTC m=+2454.465078333" watchObservedRunningTime="2025-12-17 09:45:34.808675514 +0000 UTC m=+2454.468516277" Dec 17 09:45:42 crc kubenswrapper[4935]: I1217 09:45:42.124096 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:45:42 crc kubenswrapper[4935]: E1217 09:45:42.126079 4935 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:45:45 crc kubenswrapper[4935]: I1217 09:45:45.362452 4935 scope.go:117] "RemoveContainer" containerID="9128e2ca047e1cbe724ab69b1ff38ad86d8c64e89a1d459ad6cb654321db2371" Dec 17 09:45:57 crc kubenswrapper[4935]: I1217 09:45:57.124690 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:45:57 crc kubenswrapper[4935]: E1217 09:45:57.125603 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:46:12 crc kubenswrapper[4935]: I1217 09:46:12.124474 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:46:12 crc kubenswrapper[4935]: E1217 09:46:12.125332 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:46:25 crc kubenswrapper[4935]: I1217 
09:46:25.125757 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:46:25 crc kubenswrapper[4935]: E1217 09:46:25.127840 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:46:39 crc kubenswrapper[4935]: I1217 09:46:39.124415 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:46:39 crc kubenswrapper[4935]: E1217 09:46:39.125300 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:46:51 crc kubenswrapper[4935]: I1217 09:46:51.133317 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:46:51 crc kubenswrapper[4935]: E1217 09:46:51.134815 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:47:02 crc 
kubenswrapper[4935]: I1217 09:47:02.124709 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:47:02 crc kubenswrapper[4935]: E1217 09:47:02.125494 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:47:17 crc kubenswrapper[4935]: I1217 09:47:17.125343 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:47:17 crc kubenswrapper[4935]: E1217 09:47:17.127379 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:47:30 crc kubenswrapper[4935]: I1217 09:47:30.124951 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:47:30 crc kubenswrapper[4935]: E1217 09:47:30.125757 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 
17 09:47:44 crc kubenswrapper[4935]: I1217 09:47:44.124962 4935 scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:47:44 crc kubenswrapper[4935]: I1217 09:47:44.978663 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"bded46501ade12a48f656a8c85415858ab825adaa6120ab69b754451328b7fac"} Dec 17 09:48:29 crc kubenswrapper[4935]: I1217 09:48:29.413176 4935 generic.go:334] "Generic (PLEG): container finished" podID="e1f63f1c-eca8-4e26-ab88-07f61efb54bb" containerID="59e04aa627f9e0fa9249d89f16b2ac52267480c2d846e6ad482d4f2a255b9c62" exitCode=0 Dec 17 09:48:29 crc kubenswrapper[4935]: I1217 09:48:29.413300 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" event={"ID":"e1f63f1c-eca8-4e26-ab88-07f61efb54bb","Type":"ContainerDied","Data":"59e04aa627f9e0fa9249d89f16b2ac52267480c2d846e6ad482d4f2a255b9c62"} Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.837194 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963358 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-inventory\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963487 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-1\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963546 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-ssh-key\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963627 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-0\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963685 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-extra-config-0\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963737 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-combined-ca-bundle\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963786 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqkbp\" (UniqueName: \"kubernetes.io/projected/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-kube-api-access-vqkbp\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963829 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-1\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.963859 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-0\") pod \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\" (UID: \"e1f63f1c-eca8-4e26-ab88-07f61efb54bb\") " Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.972722 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-kube-api-access-vqkbp" (OuterVolumeSpecName: "kube-api-access-vqkbp") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "kube-api-access-vqkbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.985721 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.991088 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.995290 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-inventory" (OuterVolumeSpecName: "inventory") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.998445 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.999038 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:48:30 crc kubenswrapper[4935]: I1217 09:48:30.999563 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.002864 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.010990 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e1f63f1c-eca8-4e26-ab88-07f61efb54bb" (UID: "e1f63f1c-eca8-4e26-ab88-07f61efb54bb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065624 4935 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065685 4935 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065697 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqkbp\" (UniqueName: \"kubernetes.io/projected/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-kube-api-access-vqkbp\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065706 4935 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065716 4935 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065725 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065733 4935 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-1\") on node 
\"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065759 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.065770 4935 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e1f63f1c-eca8-4e26-ab88-07f61efb54bb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.434230 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" event={"ID":"e1f63f1c-eca8-4e26-ab88-07f61efb54bb","Type":"ContainerDied","Data":"5f2381635ff20a0488749e7a4848db0ba2ba4ee65db2b4e1c87661f39befcca3"} Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.434293 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-sp6wz" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.434300 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f2381635ff20a0488749e7a4848db0ba2ba4ee65db2b4e1c87661f39befcca3" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.620218 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n"] Dec 17 09:48:31 crc kubenswrapper[4935]: E1217 09:48:31.621157 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1f63f1c-eca8-4e26-ab88-07f61efb54bb" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.621184 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1f63f1c-eca8-4e26-ab88-07f61efb54bb" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.621592 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1f63f1c-eca8-4e26-ab88-07f61efb54bb" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.640713 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.643821 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.644052 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.644189 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-q9d8z" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.644412 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.654701 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.683932 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n"] Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.688657 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.688778 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: 
\"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.688872 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.688998 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.689050 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.689138 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.689204 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqws8\" (UniqueName: \"kubernetes.io/projected/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-kube-api-access-mqws8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.790836 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.790930 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.790958 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.791001 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.791033 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqws8\" (UniqueName: \"kubernetes.io/projected/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-kube-api-access-mqws8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.791068 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.791106 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.796589 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-0\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.796659 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.800748 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.800815 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.801988 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc 
kubenswrapper[4935]: I1217 09:48:31.807148 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqws8\" (UniqueName: \"kubernetes.io/projected/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-kube-api-access-mqws8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.814648 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:31 crc kubenswrapper[4935]: I1217 09:48:31.985477 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:48:32 crc kubenswrapper[4935]: I1217 09:48:32.576647 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n"] Dec 17 09:48:33 crc kubenswrapper[4935]: I1217 09:48:33.452497 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" event={"ID":"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb","Type":"ContainerStarted","Data":"c91974a9791d7eb08cda670da67e5a26694517ddb1b8a8332568836565fac307"} Dec 17 09:48:34 crc kubenswrapper[4935]: I1217 09:48:34.462126 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" event={"ID":"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb","Type":"ContainerStarted","Data":"5cc44432bf3ae9421ddeecad87e878ac32942d40795a15d8900676c72cae22e2"} Dec 17 09:48:34 crc kubenswrapper[4935]: I1217 09:48:34.485030 4935 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" podStartSLOduration=2.745552519 podStartE2EDuration="3.485006805s" podCreationTimestamp="2025-12-17 09:48:31 +0000 UTC" firstStartedPulling="2025-12-17 09:48:32.577343424 +0000 UTC m=+2632.237184177" lastFinishedPulling="2025-12-17 09:48:33.3167977 +0000 UTC m=+2632.976638463" observedRunningTime="2025-12-17 09:48:34.479969483 +0000 UTC m=+2634.139810246" watchObservedRunningTime="2025-12-17 09:48:34.485006805 +0000 UTC m=+2634.144847568" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.610259 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k6vgm"] Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.613285 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.628173 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6vgm"] Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.762502 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-catalog-content\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.762930 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gfrp\" (UniqueName: \"kubernetes.io/projected/084d26c0-2675-4792-8625-9137b0e41131-kube-api-access-9gfrp\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 
09:48:39.763513 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-utilities\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.866036 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-utilities\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.866259 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-catalog-content\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.866316 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gfrp\" (UniqueName: \"kubernetes.io/projected/084d26c0-2675-4792-8625-9137b0e41131-kube-api-access-9gfrp\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.866641 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-utilities\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.866779 4935 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-catalog-content\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.901263 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gfrp\" (UniqueName: \"kubernetes.io/projected/084d26c0-2675-4792-8625-9137b0e41131-kube-api-access-9gfrp\") pod \"redhat-operators-k6vgm\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:39 crc kubenswrapper[4935]: I1217 09:48:39.941439 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:40 crc kubenswrapper[4935]: I1217 09:48:40.493717 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6vgm"] Dec 17 09:48:40 crc kubenswrapper[4935]: I1217 09:48:40.542212 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6vgm" event={"ID":"084d26c0-2675-4792-8625-9137b0e41131","Type":"ContainerStarted","Data":"1b96b1a3241b1823ce705f9d23630afa8b61675dc60c682faebab2d70506fd4c"} Dec 17 09:48:41 crc kubenswrapper[4935]: I1217 09:48:41.555582 4935 generic.go:334] "Generic (PLEG): container finished" podID="084d26c0-2675-4792-8625-9137b0e41131" containerID="b6104f71789c67a1be99dec930f2bd3dd8effd5a5919a487d084e2d09231c9c2" exitCode=0 Dec 17 09:48:41 crc kubenswrapper[4935]: I1217 09:48:41.555990 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6vgm" event={"ID":"084d26c0-2675-4792-8625-9137b0e41131","Type":"ContainerDied","Data":"b6104f71789c67a1be99dec930f2bd3dd8effd5a5919a487d084e2d09231c9c2"} Dec 17 09:48:42 crc kubenswrapper[4935]: I1217 09:48:42.568662 
4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6vgm" event={"ID":"084d26c0-2675-4792-8625-9137b0e41131","Type":"ContainerStarted","Data":"464d0ee8ba2d0b247ece2e7374c9b75cd7bb097b683b90fa3d4962de690e5f17"} Dec 17 09:48:45 crc kubenswrapper[4935]: I1217 09:48:45.604540 4935 generic.go:334] "Generic (PLEG): container finished" podID="084d26c0-2675-4792-8625-9137b0e41131" containerID="464d0ee8ba2d0b247ece2e7374c9b75cd7bb097b683b90fa3d4962de690e5f17" exitCode=0 Dec 17 09:48:45 crc kubenswrapper[4935]: I1217 09:48:45.604647 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6vgm" event={"ID":"084d26c0-2675-4792-8625-9137b0e41131","Type":"ContainerDied","Data":"464d0ee8ba2d0b247ece2e7374c9b75cd7bb097b683b90fa3d4962de690e5f17"} Dec 17 09:48:47 crc kubenswrapper[4935]: I1217 09:48:47.627582 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6vgm" event={"ID":"084d26c0-2675-4792-8625-9137b0e41131","Type":"ContainerStarted","Data":"cc0c53e564190e27d8ce00042b893a1b38c65bf1b2acd37ec97980e01f1085f3"} Dec 17 09:48:47 crc kubenswrapper[4935]: I1217 09:48:47.666033 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k6vgm" podStartSLOduration=3.5435866430000003 podStartE2EDuration="8.666008563s" podCreationTimestamp="2025-12-17 09:48:39 +0000 UTC" firstStartedPulling="2025-12-17 09:48:41.557914158 +0000 UTC m=+2641.217754921" lastFinishedPulling="2025-12-17 09:48:46.680336078 +0000 UTC m=+2646.340176841" observedRunningTime="2025-12-17 09:48:47.649413298 +0000 UTC m=+2647.309254071" watchObservedRunningTime="2025-12-17 09:48:47.666008563 +0000 UTC m=+2647.325849326" Dec 17 09:48:49 crc kubenswrapper[4935]: I1217 09:48:49.942660 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:49 crc 
kubenswrapper[4935]: I1217 09:48:49.943354 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:48:51 crc kubenswrapper[4935]: I1217 09:48:51.003445 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6vgm" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="registry-server" probeResult="failure" output=< Dec 17 09:48:51 crc kubenswrapper[4935]: timeout: failed to connect service ":50051" within 1s Dec 17 09:48:51 crc kubenswrapper[4935]: > Dec 17 09:48:59 crc kubenswrapper[4935]: I1217 09:48:59.995298 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:49:00 crc kubenswrapper[4935]: I1217 09:49:00.057551 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:49:00 crc kubenswrapper[4935]: I1217 09:49:00.247910 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6vgm"] Dec 17 09:49:01 crc kubenswrapper[4935]: I1217 09:49:01.775726 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k6vgm" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="registry-server" containerID="cri-o://cc0c53e564190e27d8ce00042b893a1b38c65bf1b2acd37ec97980e01f1085f3" gracePeriod=2 Dec 17 09:49:02 crc kubenswrapper[4935]: I1217 09:49:02.788686 4935 generic.go:334] "Generic (PLEG): container finished" podID="084d26c0-2675-4792-8625-9137b0e41131" containerID="cc0c53e564190e27d8ce00042b893a1b38c65bf1b2acd37ec97980e01f1085f3" exitCode=0 Dec 17 09:49:02 crc kubenswrapper[4935]: I1217 09:49:02.788734 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6vgm" 
event={"ID":"084d26c0-2675-4792-8625-9137b0e41131","Type":"ContainerDied","Data":"cc0c53e564190e27d8ce00042b893a1b38c65bf1b2acd37ec97980e01f1085f3"} Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.356842 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.547122 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-utilities\") pod \"084d26c0-2675-4792-8625-9137b0e41131\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.548004 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-utilities" (OuterVolumeSpecName: "utilities") pod "084d26c0-2675-4792-8625-9137b0e41131" (UID: "084d26c0-2675-4792-8625-9137b0e41131"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.548127 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gfrp\" (UniqueName: \"kubernetes.io/projected/084d26c0-2675-4792-8625-9137b0e41131-kube-api-access-9gfrp\") pod \"084d26c0-2675-4792-8625-9137b0e41131\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.549003 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-catalog-content\") pod \"084d26c0-2675-4792-8625-9137b0e41131\" (UID: \"084d26c0-2675-4792-8625-9137b0e41131\") " Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.550696 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.554513 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084d26c0-2675-4792-8625-9137b0e41131-kube-api-access-9gfrp" (OuterVolumeSpecName: "kube-api-access-9gfrp") pod "084d26c0-2675-4792-8625-9137b0e41131" (UID: "084d26c0-2675-4792-8625-9137b0e41131"). InnerVolumeSpecName "kube-api-access-9gfrp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.653418 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9gfrp\" (UniqueName: \"kubernetes.io/projected/084d26c0-2675-4792-8625-9137b0e41131-kube-api-access-9gfrp\") on node \"crc\" DevicePath \"\"" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.664327 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "084d26c0-2675-4792-8625-9137b0e41131" (UID: "084d26c0-2675-4792-8625-9137b0e41131"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.755677 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/084d26c0-2675-4792-8625-9137b0e41131-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.803484 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6vgm" event={"ID":"084d26c0-2675-4792-8625-9137b0e41131","Type":"ContainerDied","Data":"1b96b1a3241b1823ce705f9d23630afa8b61675dc60c682faebab2d70506fd4c"} Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.803553 4935 scope.go:117] "RemoveContainer" containerID="cc0c53e564190e27d8ce00042b893a1b38c65bf1b2acd37ec97980e01f1085f3" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.803554 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6vgm" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.834260 4935 scope.go:117] "RemoveContainer" containerID="464d0ee8ba2d0b247ece2e7374c9b75cd7bb097b683b90fa3d4962de690e5f17" Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.860406 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6vgm"] Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.872826 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k6vgm"] Dec 17 09:49:03 crc kubenswrapper[4935]: I1217 09:49:03.882219 4935 scope.go:117] "RemoveContainer" containerID="b6104f71789c67a1be99dec930f2bd3dd8effd5a5919a487d084e2d09231c9c2" Dec 17 09:49:05 crc kubenswrapper[4935]: I1217 09:49:05.137770 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="084d26c0-2675-4792-8625-9137b0e41131" path="/var/lib/kubelet/pods/084d26c0-2675-4792-8625-9137b0e41131/volumes" Dec 17 09:50:00 crc kubenswrapper[4935]: I1217 09:50:00.135735 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:50:00 crc kubenswrapper[4935]: I1217 09:50:00.141749 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:50:30 crc kubenswrapper[4935]: I1217 09:50:30.130578 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:50:30 crc kubenswrapper[4935]: I1217 09:50:30.131601 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.687806 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-74ll4"] Dec 17 09:50:58 crc kubenswrapper[4935]: E1217 09:50:58.689361 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="extract-utilities" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.689376 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="extract-utilities" Dec 17 09:50:58 crc kubenswrapper[4935]: E1217 09:50:58.689383 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="extract-content" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.689389 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="extract-content" Dec 17 09:50:58 crc kubenswrapper[4935]: E1217 09:50:58.689408 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="registry-server" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.689414 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="registry-server" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.689674 4935 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="084d26c0-2675-4792-8625-9137b0e41131" containerName="registry-server" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.691158 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.702164 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-74ll4"] Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.835888 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr249\" (UniqueName: \"kubernetes.io/projected/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-kube-api-access-gr249\") pod \"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.835968 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-catalog-content\") pod \"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.836169 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-utilities\") pod \"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.937987 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-catalog-content\") pod 
\"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.938059 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-utilities\") pod \"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.938189 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr249\" (UniqueName: \"kubernetes.io/projected/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-kube-api-access-gr249\") pod \"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.938667 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-catalog-content\") pod \"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.938974 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-utilities\") pod \"community-operators-74ll4\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:58 crc kubenswrapper[4935]: I1217 09:50:58.966870 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr249\" (UniqueName: \"kubernetes.io/projected/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-kube-api-access-gr249\") pod \"community-operators-74ll4\" (UID: 
\"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:59 crc kubenswrapper[4935]: I1217 09:50:59.017867 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:50:59 crc kubenswrapper[4935]: I1217 09:50:59.590307 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-74ll4"] Dec 17 09:50:59 crc kubenswrapper[4935]: I1217 09:50:59.997169 4935 generic.go:334] "Generic (PLEG): container finished" podID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerID="4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab" exitCode=0 Dec 17 09:50:59 crc kubenswrapper[4935]: I1217 09:50:59.997239 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74ll4" event={"ID":"2a905f72-2ff2-44c7-afe8-08e75ceb22bc","Type":"ContainerDied","Data":"4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab"} Dec 17 09:50:59 crc kubenswrapper[4935]: I1217 09:50:59.997585 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74ll4" event={"ID":"2a905f72-2ff2-44c7-afe8-08e75ceb22bc","Type":"ContainerStarted","Data":"abf8a29f84de1bc8a9418bdde7db4639bce0c31f0839d99ada9cfd8ac44c7f60"} Dec 17 09:50:59 crc kubenswrapper[4935]: I1217 09:50:59.999172 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 09:51:00 crc kubenswrapper[4935]: I1217 09:51:00.130964 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:51:00 crc kubenswrapper[4935]: I1217 09:51:00.131023 4935 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:51:00 crc kubenswrapper[4935]: I1217 09:51:00.131060 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:51:01 crc kubenswrapper[4935]: I1217 09:51:01.011242 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bded46501ade12a48f656a8c85415858ab825adaa6120ab69b754451328b7fac"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:51:01 crc kubenswrapper[4935]: I1217 09:51:01.011762 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://bded46501ade12a48f656a8c85415858ab825adaa6120ab69b754451328b7fac" gracePeriod=600 Dec 17 09:51:02 crc kubenswrapper[4935]: I1217 09:51:02.022040 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="bded46501ade12a48f656a8c85415858ab825adaa6120ab69b754451328b7fac" exitCode=0 Dec 17 09:51:02 crc kubenswrapper[4935]: I1217 09:51:02.022545 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"bded46501ade12a48f656a8c85415858ab825adaa6120ab69b754451328b7fac"} Dec 17 09:51:02 crc kubenswrapper[4935]: I1217 09:51:02.022585 4935 
scope.go:117] "RemoveContainer" containerID="e09e28986b2b563db19438b38be3aa79e04d01d2afea3deeed18390428689e2c" Dec 17 09:51:02 crc kubenswrapper[4935]: I1217 09:51:02.028800 4935 generic.go:334] "Generic (PLEG): container finished" podID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerID="cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6" exitCode=0 Dec 17 09:51:02 crc kubenswrapper[4935]: I1217 09:51:02.028854 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74ll4" event={"ID":"2a905f72-2ff2-44c7-afe8-08e75ceb22bc","Type":"ContainerDied","Data":"cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6"} Dec 17 09:51:03 crc kubenswrapper[4935]: I1217 09:51:03.040463 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092"} Dec 17 09:51:04 crc kubenswrapper[4935]: I1217 09:51:04.053637 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74ll4" event={"ID":"2a905f72-2ff2-44c7-afe8-08e75ceb22bc","Type":"ContainerStarted","Data":"a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5"} Dec 17 09:51:04 crc kubenswrapper[4935]: I1217 09:51:04.078852 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-74ll4" podStartSLOduration=2.671011609 podStartE2EDuration="6.078832605s" podCreationTimestamp="2025-12-17 09:50:58 +0000 UTC" firstStartedPulling="2025-12-17 09:50:59.998933812 +0000 UTC m=+2779.658774575" lastFinishedPulling="2025-12-17 09:51:03.406754808 +0000 UTC m=+2783.066595571" observedRunningTime="2025-12-17 09:51:04.078423355 +0000 UTC m=+2783.738264118" watchObservedRunningTime="2025-12-17 09:51:04.078832605 +0000 UTC m=+2783.738673368" Dec 17 
09:51:09 crc kubenswrapper[4935]: I1217 09:51:09.018542 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:51:09 crc kubenswrapper[4935]: I1217 09:51:09.019449 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:51:09 crc kubenswrapper[4935]: I1217 09:51:09.081627 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:51:09 crc kubenswrapper[4935]: I1217 09:51:09.150630 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:51:09 crc kubenswrapper[4935]: I1217 09:51:09.329740 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-74ll4"] Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.116427 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-74ll4" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="registry-server" containerID="cri-o://a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5" gracePeriod=2 Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.649748 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.741588 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-utilities\") pod \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.741695 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-catalog-content\") pod \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.741732 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr249\" (UniqueName: \"kubernetes.io/projected/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-kube-api-access-gr249\") pod \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\" (UID: \"2a905f72-2ff2-44c7-afe8-08e75ceb22bc\") " Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.742707 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-utilities" (OuterVolumeSpecName: "utilities") pod "2a905f72-2ff2-44c7-afe8-08e75ceb22bc" (UID: "2a905f72-2ff2-44c7-afe8-08e75ceb22bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.751671 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-kube-api-access-gr249" (OuterVolumeSpecName: "kube-api-access-gr249") pod "2a905f72-2ff2-44c7-afe8-08e75ceb22bc" (UID: "2a905f72-2ff2-44c7-afe8-08e75ceb22bc"). InnerVolumeSpecName "kube-api-access-gr249". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.814079 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a905f72-2ff2-44c7-afe8-08e75ceb22bc" (UID: "2a905f72-2ff2-44c7-afe8-08e75ceb22bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.845264 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.845328 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr249\" (UniqueName: \"kubernetes.io/projected/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-kube-api-access-gr249\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:11 crc kubenswrapper[4935]: I1217 09:51:11.845342 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a905f72-2ff2-44c7-afe8-08e75ceb22bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.131355 4935 generic.go:334] "Generic (PLEG): container finished" podID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerID="a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5" exitCode=0 Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.131434 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-74ll4" event={"ID":"2a905f72-2ff2-44c7-afe8-08e75ceb22bc","Type":"ContainerDied","Data":"a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5"} Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.131470 4935 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-74ll4" event={"ID":"2a905f72-2ff2-44c7-afe8-08e75ceb22bc","Type":"ContainerDied","Data":"abf8a29f84de1bc8a9418bdde7db4639bce0c31f0839d99ada9cfd8ac44c7f60"} Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.131495 4935 scope.go:117] "RemoveContainer" containerID="a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.131516 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-74ll4" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.184631 4935 scope.go:117] "RemoveContainer" containerID="cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.187559 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-74ll4"] Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.196475 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-74ll4"] Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.229382 4935 scope.go:117] "RemoveContainer" containerID="4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.268011 4935 scope.go:117] "RemoveContainer" containerID="a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5" Dec 17 09:51:12 crc kubenswrapper[4935]: E1217 09:51:12.268571 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5\": container with ID starting with a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5 not found: ID does not exist" containerID="a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 
09:51:12.268643 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5"} err="failed to get container status \"a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5\": rpc error: code = NotFound desc = could not find container \"a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5\": container with ID starting with a9de5e541cfdcef61ced4de2fda099f8373428fa36c22e0a23b5eb3ef6e8f9e5 not found: ID does not exist" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.268705 4935 scope.go:117] "RemoveContainer" containerID="cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6" Dec 17 09:51:12 crc kubenswrapper[4935]: E1217 09:51:12.269023 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6\": container with ID starting with cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6 not found: ID does not exist" containerID="cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.269077 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6"} err="failed to get container status \"cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6\": rpc error: code = NotFound desc = could not find container \"cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6\": container with ID starting with cee286c883f1d6aa82cac9ac12267ea4a29db5a5bc4d5146d0f21106642968b6 not found: ID does not exist" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.269098 4935 scope.go:117] "RemoveContainer" containerID="4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab" Dec 17 09:51:12 crc 
kubenswrapper[4935]: E1217 09:51:12.269366 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab\": container with ID starting with 4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab not found: ID does not exist" containerID="4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab" Dec 17 09:51:12 crc kubenswrapper[4935]: I1217 09:51:12.269419 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab"} err="failed to get container status \"4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab\": rpc error: code = NotFound desc = could not find container \"4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab\": container with ID starting with 4a2cd1eab05cdf427e808183f5f44a33dbf08b4d583987e4f4d2ce3d3891d2ab not found: ID does not exist" Dec 17 09:51:13 crc kubenswrapper[4935]: I1217 09:51:13.137366 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" path="/var/lib/kubelet/pods/2a905f72-2ff2-44c7-afe8-08e75ceb22bc/volumes" Dec 17 09:51:42 crc kubenswrapper[4935]: I1217 09:51:42.473995 4935 generic.go:334] "Generic (PLEG): container finished" podID="ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" containerID="5cc44432bf3ae9421ddeecad87e878ac32942d40795a15d8900676c72cae22e2" exitCode=0 Dec 17 09:51:42 crc kubenswrapper[4935]: I1217 09:51:42.474074 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" event={"ID":"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb","Type":"ContainerDied","Data":"5cc44432bf3ae9421ddeecad87e878ac32942d40795a15d8900676c72cae22e2"} Dec 17 09:51:43 crc kubenswrapper[4935]: I1217 09:51:43.956695 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.033420 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-0\") pod \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.033621 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-telemetry-combined-ca-bundle\") pod \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.033652 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqws8\" (UniqueName: \"kubernetes.io/projected/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-kube-api-access-mqws8\") pod \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.033676 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-1\") pod \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.033816 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ssh-key\") pod \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 
09:51:44.033835 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-inventory\") pod \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.033888 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-2\") pod \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\" (UID: \"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb\") " Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.041791 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-kube-api-access-mqws8" (OuterVolumeSpecName: "kube-api-access-mqws8") pod "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" (UID: "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb"). InnerVolumeSpecName "kube-api-access-mqws8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.043793 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" (UID: "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.063104 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" (UID: "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.063416 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" (UID: "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.063875 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" (UID: "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.071809 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" (UID: "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.076776 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-inventory" (OuterVolumeSpecName: "inventory") pod "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" (UID: "ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.135430 4935 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-inventory\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.135452 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.135461 4935 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.135474 4935 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.135485 4935 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.135495 4935 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-mqws8\" (UniqueName: \"kubernetes.io/projected/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-kube-api-access-mqws8\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.135504 4935 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.502122 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" event={"ID":"ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb","Type":"ContainerDied","Data":"c91974a9791d7eb08cda670da67e5a26694517ddb1b8a8332568836565fac307"} Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.502189 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c91974a9791d7eb08cda670da67e5a26694517ddb1b8a8332568836565fac307" Dec 17 09:51:44 crc kubenswrapper[4935]: I1217 09:51:44.502452 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.418012 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-65ptb"] Dec 17 09:51:54 crc kubenswrapper[4935]: E1217 09:51:54.418925 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="extract-utilities" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.418942 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="extract-utilities" Dec 17 09:51:54 crc kubenswrapper[4935]: E1217 09:51:54.418965 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="registry-server" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.418971 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="registry-server" Dec 17 09:51:54 crc kubenswrapper[4935]: E1217 09:51:54.418985 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="extract-content" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.418991 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="extract-content" Dec 17 09:51:54 crc kubenswrapper[4935]: E1217 09:51:54.419005 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.419013 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.419196 4935 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2a905f72-2ff2-44c7-afe8-08e75ceb22bc" containerName="registry-server" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.419210 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.422136 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.432855 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65ptb"] Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.476769 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-utilities\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.476831 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jlqw\" (UniqueName: \"kubernetes.io/projected/36d57bb6-604e-4eea-abe8-5ee9b345473c-kube-api-access-9jlqw\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.476976 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-catalog-content\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 
crc kubenswrapper[4935]: I1217 09:51:54.578686 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-catalog-content\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.578843 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-utilities\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.579338 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jlqw\" (UniqueName: \"kubernetes.io/projected/36d57bb6-604e-4eea-abe8-5ee9b345473c-kube-api-access-9jlqw\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.579459 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-catalog-content\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.579520 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-utilities\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 
09:51:54.602628 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jlqw\" (UniqueName: \"kubernetes.io/projected/36d57bb6-604e-4eea-abe8-5ee9b345473c-kube-api-access-9jlqw\") pod \"certified-operators-65ptb\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:54 crc kubenswrapper[4935]: I1217 09:51:54.747453 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:51:55 crc kubenswrapper[4935]: I1217 09:51:55.278027 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-65ptb"] Dec 17 09:51:55 crc kubenswrapper[4935]: I1217 09:51:55.623839 4935 generic.go:334] "Generic (PLEG): container finished" podID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerID="dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989" exitCode=0 Dec 17 09:51:55 crc kubenswrapper[4935]: I1217 09:51:55.624025 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65ptb" event={"ID":"36d57bb6-604e-4eea-abe8-5ee9b345473c","Type":"ContainerDied","Data":"dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989"} Dec 17 09:51:55 crc kubenswrapper[4935]: I1217 09:51:55.624391 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65ptb" event={"ID":"36d57bb6-604e-4eea-abe8-5ee9b345473c","Type":"ContainerStarted","Data":"ec137c86f50fbcee4c9231966d4711d91d07ec49a0836a26bc8fb69e25a051b7"} Dec 17 09:51:57 crc kubenswrapper[4935]: I1217 09:51:57.644769 4935 generic.go:334] "Generic (PLEG): container finished" podID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerID="399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07" exitCode=0 Dec 17 09:51:57 crc kubenswrapper[4935]: I1217 09:51:57.644852 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-65ptb" event={"ID":"36d57bb6-604e-4eea-abe8-5ee9b345473c","Type":"ContainerDied","Data":"399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07"} Dec 17 09:51:58 crc kubenswrapper[4935]: I1217 09:51:58.661450 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65ptb" event={"ID":"36d57bb6-604e-4eea-abe8-5ee9b345473c","Type":"ContainerStarted","Data":"ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840"} Dec 17 09:51:58 crc kubenswrapper[4935]: I1217 09:51:58.696155 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-65ptb" podStartSLOduration=2.123460853 podStartE2EDuration="4.696125168s" podCreationTimestamp="2025-12-17 09:51:54 +0000 UTC" firstStartedPulling="2025-12-17 09:51:55.627219161 +0000 UTC m=+2835.287059924" lastFinishedPulling="2025-12-17 09:51:58.199883476 +0000 UTC m=+2837.859724239" observedRunningTime="2025-12-17 09:51:58.687383364 +0000 UTC m=+2838.347224137" watchObservedRunningTime="2025-12-17 09:51:58.696125168 +0000 UTC m=+2838.355965931" Dec 17 09:52:04 crc kubenswrapper[4935]: I1217 09:52:04.748011 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:52:04 crc kubenswrapper[4935]: I1217 09:52:04.748917 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:52:04 crc kubenswrapper[4935]: I1217 09:52:04.804671 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:52:05 crc kubenswrapper[4935]: I1217 09:52:05.834539 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:52:05 crc kubenswrapper[4935]: I1217 09:52:05.903793 4935 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65ptb"] Dec 17 09:52:07 crc kubenswrapper[4935]: I1217 09:52:07.796729 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-65ptb" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="registry-server" containerID="cri-o://ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840" gracePeriod=2 Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.802401 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.821203 4935 generic.go:334] "Generic (PLEG): container finished" podID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerID="ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840" exitCode=0 Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.821334 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65ptb" event={"ID":"36d57bb6-604e-4eea-abe8-5ee9b345473c","Type":"ContainerDied","Data":"ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840"} Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.821373 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-65ptb" event={"ID":"36d57bb6-604e-4eea-abe8-5ee9b345473c","Type":"ContainerDied","Data":"ec137c86f50fbcee4c9231966d4711d91d07ec49a0836a26bc8fb69e25a051b7"} Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.821391 4935 scope.go:117] "RemoveContainer" containerID="ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.821551 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-65ptb" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.850659 4935 scope.go:117] "RemoveContainer" containerID="399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.880694 4935 scope.go:117] "RemoveContainer" containerID="dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.938505 4935 scope.go:117] "RemoveContainer" containerID="ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840" Dec 17 09:52:08 crc kubenswrapper[4935]: E1217 09:52:08.939226 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840\": container with ID starting with ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840 not found: ID does not exist" containerID="ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.939331 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840"} err="failed to get container status \"ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840\": rpc error: code = NotFound desc = could not find container \"ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840\": container with ID starting with ad128b2907376055f89389eb018e32cc4c61ff85b52a90e7bb47fab98890e840 not found: ID does not exist" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.939383 4935 scope.go:117] "RemoveContainer" containerID="399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07" Dec 17 09:52:08 crc kubenswrapper[4935]: E1217 09:52:08.940411 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07\": container with ID starting with 399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07 not found: ID does not exist" containerID="399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.940514 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07"} err="failed to get container status \"399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07\": rpc error: code = NotFound desc = could not find container \"399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07\": container with ID starting with 399b3fccb5a92ce3adf8688cd7d79f82b49016a8a4d0dc2cec79e63248a45a07 not found: ID does not exist" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.940557 4935 scope.go:117] "RemoveContainer" containerID="dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989" Dec 17 09:52:08 crc kubenswrapper[4935]: E1217 09:52:08.941146 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989\": container with ID starting with dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989 not found: ID does not exist" containerID="dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.941181 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989"} err="failed to get container status \"dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989\": rpc error: code = NotFound desc = could not find container 
\"dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989\": container with ID starting with dcc7c46451b697e17a616362daa374ff78dbab8adfb3b06e1180e7d1afbfa989 not found: ID does not exist" Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.993853 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jlqw\" (UniqueName: \"kubernetes.io/projected/36d57bb6-604e-4eea-abe8-5ee9b345473c-kube-api-access-9jlqw\") pod \"36d57bb6-604e-4eea-abe8-5ee9b345473c\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.994059 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-catalog-content\") pod \"36d57bb6-604e-4eea-abe8-5ee9b345473c\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.994156 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-utilities\") pod \"36d57bb6-604e-4eea-abe8-5ee9b345473c\" (UID: \"36d57bb6-604e-4eea-abe8-5ee9b345473c\") " Dec 17 09:52:08 crc kubenswrapper[4935]: I1217 09:52:08.996056 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-utilities" (OuterVolumeSpecName: "utilities") pod "36d57bb6-604e-4eea-abe8-5ee9b345473c" (UID: "36d57bb6-604e-4eea-abe8-5ee9b345473c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:52:09 crc kubenswrapper[4935]: I1217 09:52:09.005504 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d57bb6-604e-4eea-abe8-5ee9b345473c-kube-api-access-9jlqw" (OuterVolumeSpecName: "kube-api-access-9jlqw") pod "36d57bb6-604e-4eea-abe8-5ee9b345473c" (UID: "36d57bb6-604e-4eea-abe8-5ee9b345473c"). InnerVolumeSpecName "kube-api-access-9jlqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:52:09 crc kubenswrapper[4935]: I1217 09:52:09.057194 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36d57bb6-604e-4eea-abe8-5ee9b345473c" (UID: "36d57bb6-604e-4eea-abe8-5ee9b345473c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:52:09 crc kubenswrapper[4935]: I1217 09:52:09.097706 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jlqw\" (UniqueName: \"kubernetes.io/projected/36d57bb6-604e-4eea-abe8-5ee9b345473c-kube-api-access-9jlqw\") on node \"crc\" DevicePath \"\"" Dec 17 09:52:09 crc kubenswrapper[4935]: I1217 09:52:09.097748 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:52:09 crc kubenswrapper[4935]: I1217 09:52:09.097759 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d57bb6-604e-4eea-abe8-5ee9b345473c-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:52:09 crc kubenswrapper[4935]: I1217 09:52:09.164936 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-65ptb"] Dec 17 09:52:09 crc kubenswrapper[4935]: I1217 
09:52:09.174617 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-65ptb"] Dec 17 09:52:11 crc kubenswrapper[4935]: I1217 09:52:11.139179 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" path="/var/lib/kubelet/pods/36d57bb6-604e-4eea-abe8-5ee9b345473c/volumes" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.204671 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 17 09:52:32 crc kubenswrapper[4935]: E1217 09:52:32.206204 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="extract-utilities" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.206225 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="extract-utilities" Dec 17 09:52:32 crc kubenswrapper[4935]: E1217 09:52:32.206262 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="registry-server" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.206267 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="registry-server" Dec 17 09:52:32 crc kubenswrapper[4935]: E1217 09:52:32.206293 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="extract-content" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.206300 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="extract-content" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.206523 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d57bb6-604e-4eea-abe8-5ee9b345473c" containerName="registry-server" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.207219 4935 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.213616 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.213895 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.214045 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.214175 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-45q57" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.216183 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.324787 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.324916 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.324960 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.325005 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.325086 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.325106 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.325131 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.325158 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgdww\" (UniqueName: 
\"kubernetes.io/projected/36307e10-5953-420e-9627-2812d493abea-kube-api-access-qgdww\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.325175 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-config-data\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.426733 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.426784 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.426818 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.426846 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-config-data\") pod \"tempest-tests-tempest\" 
(UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.426862 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgdww\" (UniqueName: \"kubernetes.io/projected/36307e10-5953-420e-9627-2812d493abea-kube-api-access-qgdww\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.426908 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.426987 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.427029 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.427068 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " 
pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.428461 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.428733 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.428996 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.429863 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.429916 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-config-data\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc 
kubenswrapper[4935]: I1217 09:52:32.435972 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.436093 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.446470 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.447467 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgdww\" (UniqueName: \"kubernetes.io/projected/36307e10-5953-420e-9627-2812d493abea-kube-api-access-qgdww\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.463297 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.532359 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 17 09:52:32 crc kubenswrapper[4935]: I1217 09:52:32.969397 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 17 09:52:33 crc kubenswrapper[4935]: I1217 09:52:33.062226 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"36307e10-5953-420e-9627-2812d493abea","Type":"ContainerStarted","Data":"1eb739047a68985fcf51bb544f07915f6d31552bc129d769c10603983ff0c231"} Dec 17 09:53:13 crc kubenswrapper[4935]: E1217 09:53:13.371110 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 17 09:53:13 crc kubenswrapper[4935]: E1217 09:53:13.372689 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathE
xpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qgdww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
tempest-tests-tempest_openstack(36307e10-5953-420e-9627-2812d493abea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 09:53:13 crc kubenswrapper[4935]: E1217 09:53:13.373980 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="36307e10-5953-420e-9627-2812d493abea" Dec 17 09:53:13 crc kubenswrapper[4935]: E1217 09:53:13.451356 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="36307e10-5953-420e-9627-2812d493abea" Dec 17 09:53:27 crc kubenswrapper[4935]: I1217 09:53:27.590431 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 17 09:53:29 crc kubenswrapper[4935]: I1217 09:53:29.601615 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"36307e10-5953-420e-9627-2812d493abea","Type":"ContainerStarted","Data":"bd7ced6a6394544aca77363a42f4708768ece2ecce1846445befd480020521ee"} Dec 17 09:53:29 crc kubenswrapper[4935]: I1217 09:53:29.634705 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.038796028 podStartE2EDuration="58.634678022s" podCreationTimestamp="2025-12-17 09:52:31 +0000 UTC" firstStartedPulling="2025-12-17 09:52:32.99132729 +0000 UTC m=+2872.651168053" lastFinishedPulling="2025-12-17 09:53:27.587209284 +0000 UTC m=+2927.247050047" observedRunningTime="2025-12-17 09:53:29.630900209 +0000 UTC m=+2929.290740972" 
watchObservedRunningTime="2025-12-17 09:53:29.634678022 +0000 UTC m=+2929.294518815" Dec 17 09:53:30 crc kubenswrapper[4935]: I1217 09:53:30.131030 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:53:30 crc kubenswrapper[4935]: I1217 09:53:30.131143 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:54:00 crc kubenswrapper[4935]: I1217 09:54:00.130594 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:54:00 crc kubenswrapper[4935]: I1217 09:54:00.131376 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:54:30 crc kubenswrapper[4935]: I1217 09:54:30.131839 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 09:54:30 crc kubenswrapper[4935]: I1217 09:54:30.133076 
4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 09:54:30 crc kubenswrapper[4935]: I1217 09:54:30.133128 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 09:54:30 crc kubenswrapper[4935]: I1217 09:54:30.134062 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 09:54:30 crc kubenswrapper[4935]: I1217 09:54:30.134135 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" gracePeriod=600 Dec 17 09:54:32 crc kubenswrapper[4935]: E1217 09:54:32.088726 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:54:32 crc kubenswrapper[4935]: I1217 09:54:32.296117 4935 generic.go:334] "Generic (PLEG): container finished" 
podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" exitCode=0 Dec 17 09:54:32 crc kubenswrapper[4935]: I1217 09:54:32.296164 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092"} Dec 17 09:54:32 crc kubenswrapper[4935]: I1217 09:54:32.296201 4935 scope.go:117] "RemoveContainer" containerID="bded46501ade12a48f656a8c85415858ab825adaa6120ab69b754451328b7fac" Dec 17 09:54:32 crc kubenswrapper[4935]: I1217 09:54:32.296764 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:54:32 crc kubenswrapper[4935]: E1217 09:54:32.297057 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:54:43 crc kubenswrapper[4935]: I1217 09:54:43.124643 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:54:43 crc kubenswrapper[4935]: E1217 09:54:43.125750 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 
09:54:57 crc kubenswrapper[4935]: I1217 09:54:57.124390 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:54:57 crc kubenswrapper[4935]: E1217 09:54:57.125420 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:55:08 crc kubenswrapper[4935]: I1217 09:55:08.124487 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:55:08 crc kubenswrapper[4935]: E1217 09:55:08.125627 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.720095 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9rpks"] Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.725038 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.783949 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rpks"] Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.784647 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-catalog-content\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.784680 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knllf\" (UniqueName: \"kubernetes.io/projected/4ca90774-47fa-4a57-8153-dd6c7e514300-kube-api-access-knllf\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.784769 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-utilities\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.886971 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-catalog-content\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.887014 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-knllf\" (UniqueName: \"kubernetes.io/projected/4ca90774-47fa-4a57-8153-dd6c7e514300-kube-api-access-knllf\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.887060 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-utilities\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.887564 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-utilities\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.887919 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-catalog-content\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:17 crc kubenswrapper[4935]: I1217 09:55:17.908754 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knllf\" (UniqueName: \"kubernetes.io/projected/4ca90774-47fa-4a57-8153-dd6c7e514300-kube-api-access-knllf\") pod \"redhat-marketplace-9rpks\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:18 crc kubenswrapper[4935]: I1217 09:55:18.053692 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:18 crc kubenswrapper[4935]: I1217 09:55:18.525928 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rpks"] Dec 17 09:55:18 crc kubenswrapper[4935]: I1217 09:55:18.793339 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rpks" event={"ID":"4ca90774-47fa-4a57-8153-dd6c7e514300","Type":"ContainerStarted","Data":"87bfb411f122829ccda5df312e2a80abca9caf1e218c294db8e87806693d4eb9"} Dec 17 09:55:19 crc kubenswrapper[4935]: I1217 09:55:19.124468 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:55:19 crc kubenswrapper[4935]: E1217 09:55:19.124764 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:55:19 crc kubenswrapper[4935]: I1217 09:55:19.803015 4935 generic.go:334] "Generic (PLEG): container finished" podID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerID="2f28e32c29cf1757dedd6af9159a28c94d86ff47b66ade13b315310029606942" exitCode=0 Dec 17 09:55:19 crc kubenswrapper[4935]: I1217 09:55:19.803067 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rpks" event={"ID":"4ca90774-47fa-4a57-8153-dd6c7e514300","Type":"ContainerDied","Data":"2f28e32c29cf1757dedd6af9159a28c94d86ff47b66ade13b315310029606942"} Dec 17 09:55:21 crc kubenswrapper[4935]: I1217 09:55:21.829413 4935 generic.go:334] "Generic (PLEG): container finished" podID="4ca90774-47fa-4a57-8153-dd6c7e514300" 
containerID="f1e8ced55566b42adc2306bd6427adf9cb74b45a7aee5842f99e0ac6ba0cbceb" exitCode=0 Dec 17 09:55:21 crc kubenswrapper[4935]: I1217 09:55:21.830112 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rpks" event={"ID":"4ca90774-47fa-4a57-8153-dd6c7e514300","Type":"ContainerDied","Data":"f1e8ced55566b42adc2306bd6427adf9cb74b45a7aee5842f99e0ac6ba0cbceb"} Dec 17 09:55:23 crc kubenswrapper[4935]: I1217 09:55:23.851717 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rpks" event={"ID":"4ca90774-47fa-4a57-8153-dd6c7e514300","Type":"ContainerStarted","Data":"fc44da89dcf820eca3468c98b490a84b5a46bf74a4667b515d707ec58d4a249c"} Dec 17 09:55:23 crc kubenswrapper[4935]: I1217 09:55:23.881185 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9rpks" podStartSLOduration=3.834983286 podStartE2EDuration="6.881163478s" podCreationTimestamp="2025-12-17 09:55:17 +0000 UTC" firstStartedPulling="2025-12-17 09:55:19.804867159 +0000 UTC m=+3039.464707932" lastFinishedPulling="2025-12-17 09:55:22.851047361 +0000 UTC m=+3042.510888124" observedRunningTime="2025-12-17 09:55:23.865908962 +0000 UTC m=+3043.525749735" watchObservedRunningTime="2025-12-17 09:55:23.881163478 +0000 UTC m=+3043.541004241" Dec 17 09:55:28 crc kubenswrapper[4935]: I1217 09:55:28.054036 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:28 crc kubenswrapper[4935]: I1217 09:55:28.054703 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:28 crc kubenswrapper[4935]: I1217 09:55:28.100416 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:28 crc kubenswrapper[4935]: I1217 09:55:28.945740 
4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:29 crc kubenswrapper[4935]: I1217 09:55:29.003959 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rpks"] Dec 17 09:55:30 crc kubenswrapper[4935]: I1217 09:55:30.918730 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9rpks" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="registry-server" containerID="cri-o://fc44da89dcf820eca3468c98b490a84b5a46bf74a4667b515d707ec58d4a249c" gracePeriod=2 Dec 17 09:55:32 crc kubenswrapper[4935]: I1217 09:55:32.938748 4935 generic.go:334] "Generic (PLEG): container finished" podID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerID="fc44da89dcf820eca3468c98b490a84b5a46bf74a4667b515d707ec58d4a249c" exitCode=0 Dec 17 09:55:32 crc kubenswrapper[4935]: I1217 09:55:32.938818 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rpks" event={"ID":"4ca90774-47fa-4a57-8153-dd6c7e514300","Type":"ContainerDied","Data":"fc44da89dcf820eca3468c98b490a84b5a46bf74a4667b515d707ec58d4a249c"} Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.128138 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:55:33 crc kubenswrapper[4935]: E1217 09:55:33.128509 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.296329 4935 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.397983 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knllf\" (UniqueName: \"kubernetes.io/projected/4ca90774-47fa-4a57-8153-dd6c7e514300-kube-api-access-knllf\") pod \"4ca90774-47fa-4a57-8153-dd6c7e514300\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.398154 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-utilities\") pod \"4ca90774-47fa-4a57-8153-dd6c7e514300\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.398173 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-catalog-content\") pod \"4ca90774-47fa-4a57-8153-dd6c7e514300\" (UID: \"4ca90774-47fa-4a57-8153-dd6c7e514300\") " Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.399162 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-utilities" (OuterVolumeSpecName: "utilities") pod "4ca90774-47fa-4a57-8153-dd6c7e514300" (UID: "4ca90774-47fa-4a57-8153-dd6c7e514300"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.404549 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca90774-47fa-4a57-8153-dd6c7e514300-kube-api-access-knllf" (OuterVolumeSpecName: "kube-api-access-knllf") pod "4ca90774-47fa-4a57-8153-dd6c7e514300" (UID: "4ca90774-47fa-4a57-8153-dd6c7e514300"). 
InnerVolumeSpecName "kube-api-access-knllf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.417594 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ca90774-47fa-4a57-8153-dd6c7e514300" (UID: "4ca90774-47fa-4a57-8153-dd6c7e514300"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.500999 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knllf\" (UniqueName: \"kubernetes.io/projected/4ca90774-47fa-4a57-8153-dd6c7e514300-kube-api-access-knllf\") on node \"crc\" DevicePath \"\"" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.501035 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.501045 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ca90774-47fa-4a57-8153-dd6c7e514300-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.949226 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9rpks" event={"ID":"4ca90774-47fa-4a57-8153-dd6c7e514300","Type":"ContainerDied","Data":"87bfb411f122829ccda5df312e2a80abca9caf1e218c294db8e87806693d4eb9"} Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.949292 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9rpks" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.949670 4935 scope.go:117] "RemoveContainer" containerID="fc44da89dcf820eca3468c98b490a84b5a46bf74a4667b515d707ec58d4a249c" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.972872 4935 scope.go:117] "RemoveContainer" containerID="f1e8ced55566b42adc2306bd6427adf9cb74b45a7aee5842f99e0ac6ba0cbceb" Dec 17 09:55:33 crc kubenswrapper[4935]: I1217 09:55:33.987926 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rpks"] Dec 17 09:55:34 crc kubenswrapper[4935]: I1217 09:55:34.002331 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9rpks"] Dec 17 09:55:34 crc kubenswrapper[4935]: I1217 09:55:34.018364 4935 scope.go:117] "RemoveContainer" containerID="2f28e32c29cf1757dedd6af9159a28c94d86ff47b66ade13b315310029606942" Dec 17 09:55:35 crc kubenswrapper[4935]: I1217 09:55:35.135613 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" path="/var/lib/kubelet/pods/4ca90774-47fa-4a57-8153-dd6c7e514300/volumes" Dec 17 09:55:47 crc kubenswrapper[4935]: I1217 09:55:47.124780 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:55:47 crc kubenswrapper[4935]: E1217 09:55:47.125648 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:56:00 crc kubenswrapper[4935]: I1217 09:56:00.124790 4935 scope.go:117] "RemoveContainer" 
containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:56:00 crc kubenswrapper[4935]: E1217 09:56:00.125754 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:56:14 crc kubenswrapper[4935]: I1217 09:56:14.124867 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:56:14 crc kubenswrapper[4935]: E1217 09:56:14.125679 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:56:25 crc kubenswrapper[4935]: I1217 09:56:25.124700 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:56:25 crc kubenswrapper[4935]: E1217 09:56:25.126691 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:56:40 crc kubenswrapper[4935]: I1217 09:56:40.124759 4935 scope.go:117] 
"RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:56:40 crc kubenswrapper[4935]: E1217 09:56:40.125748 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:56:55 crc kubenswrapper[4935]: I1217 09:56:55.125093 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:56:55 crc kubenswrapper[4935]: E1217 09:56:55.126119 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:57:06 crc kubenswrapper[4935]: I1217 09:57:06.124477 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:57:06 crc kubenswrapper[4935]: E1217 09:57:06.125562 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:57:21 crc kubenswrapper[4935]: I1217 09:57:21.133472 
4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:57:21 crc kubenswrapper[4935]: E1217 09:57:21.134658 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:57:36 crc kubenswrapper[4935]: I1217 09:57:36.125824 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:57:36 crc kubenswrapper[4935]: E1217 09:57:36.126877 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:57:47 crc kubenswrapper[4935]: I1217 09:57:47.124677 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:57:47 crc kubenswrapper[4935]: E1217 09:57:47.125546 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:58:01 crc kubenswrapper[4935]: I1217 
09:58:01.130828 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:58:01 crc kubenswrapper[4935]: E1217 09:58:01.131827 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:58:14 crc kubenswrapper[4935]: I1217 09:58:14.125183 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:58:14 crc kubenswrapper[4935]: E1217 09:58:14.126166 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:58:25 crc kubenswrapper[4935]: I1217 09:58:25.124473 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:58:25 crc kubenswrapper[4935]: E1217 09:58:25.125675 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:58:37 crc 
kubenswrapper[4935]: I1217 09:58:37.124851 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:58:37 crc kubenswrapper[4935]: E1217 09:58:37.125805 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:58:49 crc kubenswrapper[4935]: I1217 09:58:49.124255 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:58:49 crc kubenswrapper[4935]: E1217 09:58:49.125087 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:59:03 crc kubenswrapper[4935]: I1217 09:59:03.124859 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:59:03 crc kubenswrapper[4935]: E1217 09:59:03.125897 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 
17 09:59:16 crc kubenswrapper[4935]: I1217 09:59:16.124965 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:59:16 crc kubenswrapper[4935]: E1217 09:59:16.126153 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:59:31 crc kubenswrapper[4935]: I1217 09:59:31.139667 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:59:31 crc kubenswrapper[4935]: E1217 09:59:31.141089 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 09:59:45 crc kubenswrapper[4935]: I1217 09:59:45.124578 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 09:59:45 crc kubenswrapper[4935]: I1217 09:59:45.414865 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"3fedab7535056f58f455e566301bd7d5ef66bd45dfcef7177d9b174d3b301c2b"} Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.148396 4935 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz"] Dec 17 10:00:00 crc kubenswrapper[4935]: E1217 10:00:00.149631 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="extract-content" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.149651 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="extract-content" Dec 17 10:00:00 crc kubenswrapper[4935]: E1217 10:00:00.149690 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="extract-utilities" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.149698 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="extract-utilities" Dec 17 10:00:00 crc kubenswrapper[4935]: E1217 10:00:00.149724 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="registry-server" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.149732 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="registry-server" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.149973 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca90774-47fa-4a57-8153-dd6c7e514300" containerName="registry-server" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.150845 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.152985 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.153817 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.162529 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz"] Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.232092 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxbt\" (UniqueName: \"kubernetes.io/projected/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-kube-api-access-vcxbt\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.232202 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-secret-volume\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.232452 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-config-volume\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.334685 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxbt\" (UniqueName: \"kubernetes.io/projected/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-kube-api-access-vcxbt\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.335152 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-secret-volume\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.335258 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-config-volume\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.336430 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-config-volume\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.342047 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-secret-volume\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.352756 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxbt\" (UniqueName: \"kubernetes.io/projected/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-kube-api-access-vcxbt\") pod \"collect-profiles-29432760-hz6kz\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.471152 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:00 crc kubenswrapper[4935]: I1217 10:00:00.958230 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz"] Dec 17 10:00:00 crc kubenswrapper[4935]: W1217 10:00:00.964316 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e36b8f4_f6cb_450c_bffc_ae202e8b3eef.slice/crio-6e73cbcdfec14c928a5d87fa4c2a7f1c1d37ae1e4a191eb411862780f0779b62 WatchSource:0}: Error finding container 6e73cbcdfec14c928a5d87fa4c2a7f1c1d37ae1e4a191eb411862780f0779b62: Status 404 returned error can't find the container with id 6e73cbcdfec14c928a5d87fa4c2a7f1c1d37ae1e4a191eb411862780f0779b62 Dec 17 10:00:01 crc kubenswrapper[4935]: I1217 10:00:01.567089 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" event={"ID":"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef","Type":"ContainerStarted","Data":"d381c39b69726676d54e9fb18a7e903f822f96822ee9a36438b4e1fe52a4def0"} Dec 17 10:00:01 crc 
kubenswrapper[4935]: I1217 10:00:01.567536 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" event={"ID":"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef","Type":"ContainerStarted","Data":"6e73cbcdfec14c928a5d87fa4c2a7f1c1d37ae1e4a191eb411862780f0779b62"} Dec 17 10:00:02 crc kubenswrapper[4935]: I1217 10:00:02.577235 4935 generic.go:334] "Generic (PLEG): container finished" podID="2e36b8f4-f6cb-450c-bffc-ae202e8b3eef" containerID="d381c39b69726676d54e9fb18a7e903f822f96822ee9a36438b4e1fe52a4def0" exitCode=0 Dec 17 10:00:02 crc kubenswrapper[4935]: I1217 10:00:02.577645 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" event={"ID":"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef","Type":"ContainerDied","Data":"d381c39b69726676d54e9fb18a7e903f822f96822ee9a36438b4e1fe52a4def0"} Dec 17 10:00:03 crc kubenswrapper[4935]: I1217 10:00:03.976872 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.107813 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-config-volume\") pod \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.107971 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcxbt\" (UniqueName: \"kubernetes.io/projected/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-kube-api-access-vcxbt\") pod \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.108162 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-secret-volume\") pod \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\" (UID: \"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef\") " Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.108850 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-config-volume" (OuterVolumeSpecName: "config-volume") pod "2e36b8f4-f6cb-450c-bffc-ae202e8b3eef" (UID: "2e36b8f4-f6cb-450c-bffc-ae202e8b3eef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.119720 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-kube-api-access-vcxbt" (OuterVolumeSpecName: "kube-api-access-vcxbt") pod "2e36b8f4-f6cb-450c-bffc-ae202e8b3eef" (UID: "2e36b8f4-f6cb-450c-bffc-ae202e8b3eef"). 
InnerVolumeSpecName "kube-api-access-vcxbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.119802 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2e36b8f4-f6cb-450c-bffc-ae202e8b3eef" (UID: "2e36b8f4-f6cb-450c-bffc-ae202e8b3eef"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.209969 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcxbt\" (UniqueName: \"kubernetes.io/projected/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-kube-api-access-vcxbt\") on node \"crc\" DevicePath \"\"" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.210228 4935 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.210237 4935 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2e36b8f4-f6cb-450c-bffc-ae202e8b3eef-config-volume\") on node \"crc\" DevicePath \"\"" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.598054 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" event={"ID":"2e36b8f4-f6cb-450c-bffc-ae202e8b3eef","Type":"ContainerDied","Data":"6e73cbcdfec14c928a5d87fa4c2a7f1c1d37ae1e4a191eb411862780f0779b62"} Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.598097 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e73cbcdfec14c928a5d87fa4c2a7f1c1d37ae1e4a191eb411862780f0779b62" Dec 17 10:00:04 crc kubenswrapper[4935]: I1217 10:00:04.598183 4935 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432760-hz6kz" Dec 17 10:00:05 crc kubenswrapper[4935]: I1217 10:00:05.052263 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"] Dec 17 10:00:05 crc kubenswrapper[4935]: I1217 10:00:05.059989 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432715-p95b5"] Dec 17 10:00:05 crc kubenswrapper[4935]: I1217 10:00:05.135014 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9411393b-db02-442f-a053-05760712be53" path="/var/lib/kubelet/pods/9411393b-db02-442f-a053-05760712be53/volumes" Dec 17 10:00:13 crc kubenswrapper[4935]: I1217 10:00:13.473735 4935 scope.go:117] "RemoveContainer" containerID="55228e4591950f0dbf1234ba423d05d343a7c3c6f7f65e081d916de0945a3440" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.150907 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29432761-2h8vv"] Dec 17 10:01:00 crc kubenswrapper[4935]: E1217 10:01:00.152026 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e36b8f4-f6cb-450c-bffc-ae202e8b3eef" containerName="collect-profiles" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.152042 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e36b8f4-f6cb-450c-bffc-ae202e8b3eef" containerName="collect-profiles" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.152230 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e36b8f4-f6cb-450c-bffc-ae202e8b3eef" containerName="collect-profiles" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.152926 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.159789 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29432761-2h8vv"] Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.246106 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-combined-ca-bundle\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.246587 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r8h6\" (UniqueName: \"kubernetes.io/projected/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-kube-api-access-5r8h6\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.246918 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-fernet-keys\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.247052 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-config-data\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.348747 4935 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-fernet-keys\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.348839 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-config-data\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.348878 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-combined-ca-bundle\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.348901 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r8h6\" (UniqueName: \"kubernetes.io/projected/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-kube-api-access-5r8h6\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.356056 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-config-data\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.356599 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-fernet-keys\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.360025 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-combined-ca-bundle\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.367546 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r8h6\" (UniqueName: \"kubernetes.io/projected/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-kube-api-access-5r8h6\") pod \"keystone-cron-29432761-2h8vv\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.477962 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:00 crc kubenswrapper[4935]: I1217 10:01:00.977927 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29432761-2h8vv"] Dec 17 10:01:01 crc kubenswrapper[4935]: I1217 10:01:01.105718 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29432761-2h8vv" event={"ID":"6204d422-0f4a-40eb-a3ed-eb53e9220c9e","Type":"ContainerStarted","Data":"ad8c400c821d5b791c4ed5a80ed09b277cad537ffd7a4160c5b79d825a7d975b"} Dec 17 10:01:02 crc kubenswrapper[4935]: I1217 10:01:02.522514 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29432761-2h8vv" event={"ID":"6204d422-0f4a-40eb-a3ed-eb53e9220c9e","Type":"ContainerStarted","Data":"db3842ada957cf3086f095350ab2bff9602e6693e770376aad9ea312d8d31186"} Dec 17 10:01:02 crc kubenswrapper[4935]: I1217 10:01:02.588857 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29432761-2h8vv" podStartSLOduration=2.5888323619999998 podStartE2EDuration="2.588832362s" podCreationTimestamp="2025-12-17 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 10:01:02.574666835 +0000 UTC m=+3382.234507598" watchObservedRunningTime="2025-12-17 10:01:02.588832362 +0000 UTC m=+3382.248673125" Dec 17 10:01:03 crc kubenswrapper[4935]: I1217 10:01:03.544219 4935 generic.go:334] "Generic (PLEG): container finished" podID="6204d422-0f4a-40eb-a3ed-eb53e9220c9e" containerID="db3842ada957cf3086f095350ab2bff9602e6693e770376aad9ea312d8d31186" exitCode=0 Dec 17 10:01:03 crc kubenswrapper[4935]: I1217 10:01:03.544367 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29432761-2h8vv" 
event={"ID":"6204d422-0f4a-40eb-a3ed-eb53e9220c9e","Type":"ContainerDied","Data":"db3842ada957cf3086f095350ab2bff9602e6693e770376aad9ea312d8d31186"} Dec 17 10:01:04 crc kubenswrapper[4935]: I1217 10:01:04.911650 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.025506 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r8h6\" (UniqueName: \"kubernetes.io/projected/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-kube-api-access-5r8h6\") pod \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.025614 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-config-data\") pod \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.025864 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-fernet-keys\") pod \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.025902 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-combined-ca-bundle\") pod \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\" (UID: \"6204d422-0f4a-40eb-a3ed-eb53e9220c9e\") " Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.031690 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-kube-api-access-5r8h6" 
(OuterVolumeSpecName: "kube-api-access-5r8h6") pod "6204d422-0f4a-40eb-a3ed-eb53e9220c9e" (UID: "6204d422-0f4a-40eb-a3ed-eb53e9220c9e"). InnerVolumeSpecName "kube-api-access-5r8h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.032049 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6204d422-0f4a-40eb-a3ed-eb53e9220c9e" (UID: "6204d422-0f4a-40eb-a3ed-eb53e9220c9e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.053529 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6204d422-0f4a-40eb-a3ed-eb53e9220c9e" (UID: "6204d422-0f4a-40eb-a3ed-eb53e9220c9e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.075088 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-config-data" (OuterVolumeSpecName: "config-data") pod "6204d422-0f4a-40eb-a3ed-eb53e9220c9e" (UID: "6204d422-0f4a-40eb-a3ed-eb53e9220c9e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.130203 4935 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.130231 4935 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.130241 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r8h6\" (UniqueName: \"kubernetes.io/projected/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-kube-api-access-5r8h6\") on node \"crc\" DevicePath \"\"" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.130250 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6204d422-0f4a-40eb-a3ed-eb53e9220c9e-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.564326 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29432761-2h8vv" event={"ID":"6204d422-0f4a-40eb-a3ed-eb53e9220c9e","Type":"ContainerDied","Data":"ad8c400c821d5b791c4ed5a80ed09b277cad537ffd7a4160c5b79d825a7d975b"} Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.564366 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad8c400c821d5b791c4ed5a80ed09b277cad537ffd7a4160c5b79d825a7d975b" Dec 17 10:01:05 crc kubenswrapper[4935]: I1217 10:01:05.564380 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29432761-2h8vv" Dec 17 10:02:00 crc kubenswrapper[4935]: I1217 10:02:00.131393 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:02:00 crc kubenswrapper[4935]: I1217 10:02:00.132360 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.133909 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vqcdm"] Dec 17 10:02:23 crc kubenswrapper[4935]: E1217 10:02:23.134986 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6204d422-0f4a-40eb-a3ed-eb53e9220c9e" containerName="keystone-cron" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.135001 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6204d422-0f4a-40eb-a3ed-eb53e9220c9e" containerName="keystone-cron" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.135213 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6204d422-0f4a-40eb-a3ed-eb53e9220c9e" containerName="keystone-cron" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.136732 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.146998 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqcdm"] Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.253824 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-utilities\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.254186 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-catalog-content\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.254225 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgfzg\" (UniqueName: \"kubernetes.io/projected/313b0503-c44b-412b-bf47-4248f3c6e28c-kube-api-access-rgfzg\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.356973 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-catalog-content\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.357009 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rgfzg\" (UniqueName: \"kubernetes.io/projected/313b0503-c44b-412b-bf47-4248f3c6e28c-kube-api-access-rgfzg\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.357072 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-utilities\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.357465 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-catalog-content\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.357537 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-utilities\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.384528 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgfzg\" (UniqueName: \"kubernetes.io/projected/313b0503-c44b-412b-bf47-4248f3c6e28c-kube-api-access-rgfzg\") pod \"community-operators-vqcdm\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:23 crc kubenswrapper[4935]: I1217 10:02:23.468840 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:24 crc kubenswrapper[4935]: I1217 10:02:24.059057 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vqcdm"] Dec 17 10:02:24 crc kubenswrapper[4935]: I1217 10:02:24.249461 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqcdm" event={"ID":"313b0503-c44b-412b-bf47-4248f3c6e28c","Type":"ContainerStarted","Data":"148b1478d5a24525ce398433ffab081bf258db8ed2d7a2ffed320d9a05fa01f3"} Dec 17 10:02:25 crc kubenswrapper[4935]: I1217 10:02:25.262417 4935 generic.go:334] "Generic (PLEG): container finished" podID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerID="a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea" exitCode=0 Dec 17 10:02:25 crc kubenswrapper[4935]: I1217 10:02:25.262510 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqcdm" event={"ID":"313b0503-c44b-412b-bf47-4248f3c6e28c","Type":"ContainerDied","Data":"a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea"} Dec 17 10:02:25 crc kubenswrapper[4935]: I1217 10:02:25.265928 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 10:02:26 crc kubenswrapper[4935]: I1217 10:02:26.273313 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqcdm" event={"ID":"313b0503-c44b-412b-bf47-4248f3c6e28c","Type":"ContainerStarted","Data":"71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9"} Dec 17 10:02:27 crc kubenswrapper[4935]: I1217 10:02:27.283674 4935 generic.go:334] "Generic (PLEG): container finished" podID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerID="71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9" exitCode=0 Dec 17 10:02:27 crc kubenswrapper[4935]: I1217 10:02:27.283729 4935 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-vqcdm" event={"ID":"313b0503-c44b-412b-bf47-4248f3c6e28c","Type":"ContainerDied","Data":"71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9"} Dec 17 10:02:28 crc kubenswrapper[4935]: I1217 10:02:28.295940 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqcdm" event={"ID":"313b0503-c44b-412b-bf47-4248f3c6e28c","Type":"ContainerStarted","Data":"ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7"} Dec 17 10:02:28 crc kubenswrapper[4935]: I1217 10:02:28.319668 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vqcdm" podStartSLOduration=2.6025500839999998 podStartE2EDuration="5.319651731s" podCreationTimestamp="2025-12-17 10:02:23 +0000 UTC" firstStartedPulling="2025-12-17 10:02:25.265692148 +0000 UTC m=+3464.925532911" lastFinishedPulling="2025-12-17 10:02:27.982793805 +0000 UTC m=+3467.642634558" observedRunningTime="2025-12-17 10:02:28.31347523 +0000 UTC m=+3467.973316003" watchObservedRunningTime="2025-12-17 10:02:28.319651731 +0000 UTC m=+3467.979492494" Dec 17 10:02:30 crc kubenswrapper[4935]: I1217 10:02:30.130816 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:02:30 crc kubenswrapper[4935]: I1217 10:02:30.131312 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:02:33 crc kubenswrapper[4935]: I1217 10:02:33.469045 4935 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:33 crc kubenswrapper[4935]: I1217 10:02:33.469837 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:33 crc kubenswrapper[4935]: I1217 10:02:33.516774 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:34 crc kubenswrapper[4935]: I1217 10:02:34.420299 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:34 crc kubenswrapper[4935]: I1217 10:02:34.484566 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqcdm"] Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.368520 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vqcdm" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerName="registry-server" containerID="cri-o://ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7" gracePeriod=2 Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.852318 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.926595 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-catalog-content\") pod \"313b0503-c44b-412b-bf47-4248f3c6e28c\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.926710 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgfzg\" (UniqueName: \"kubernetes.io/projected/313b0503-c44b-412b-bf47-4248f3c6e28c-kube-api-access-rgfzg\") pod \"313b0503-c44b-412b-bf47-4248f3c6e28c\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.926811 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-utilities\") pod \"313b0503-c44b-412b-bf47-4248f3c6e28c\" (UID: \"313b0503-c44b-412b-bf47-4248f3c6e28c\") " Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.927730 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-utilities" (OuterVolumeSpecName: "utilities") pod "313b0503-c44b-412b-bf47-4248f3c6e28c" (UID: "313b0503-c44b-412b-bf47-4248f3c6e28c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.932527 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313b0503-c44b-412b-bf47-4248f3c6e28c-kube-api-access-rgfzg" (OuterVolumeSpecName: "kube-api-access-rgfzg") pod "313b0503-c44b-412b-bf47-4248f3c6e28c" (UID: "313b0503-c44b-412b-bf47-4248f3c6e28c"). InnerVolumeSpecName "kube-api-access-rgfzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:02:36 crc kubenswrapper[4935]: I1217 10:02:36.972889 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "313b0503-c44b-412b-bf47-4248f3c6e28c" (UID: "313b0503-c44b-412b-bf47-4248f3c6e28c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.029066 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.029110 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgfzg\" (UniqueName: \"kubernetes.io/projected/313b0503-c44b-412b-bf47-4248f3c6e28c-kube-api-access-rgfzg\") on node \"crc\" DevicePath \"\"" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.029127 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/313b0503-c44b-412b-bf47-4248f3c6e28c-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.380299 4935 generic.go:334] "Generic (PLEG): container finished" podID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerID="ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7" exitCode=0 Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.380340 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqcdm" event={"ID":"313b0503-c44b-412b-bf47-4248f3c6e28c","Type":"ContainerDied","Data":"ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7"} Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.380373 4935 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-vqcdm" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.380400 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vqcdm" event={"ID":"313b0503-c44b-412b-bf47-4248f3c6e28c","Type":"ContainerDied","Data":"148b1478d5a24525ce398433ffab081bf258db8ed2d7a2ffed320d9a05fa01f3"} Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.380423 4935 scope.go:117] "RemoveContainer" containerID="ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.409358 4935 scope.go:117] "RemoveContainer" containerID="71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.415085 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vqcdm"] Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.427018 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vqcdm"] Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.439628 4935 scope.go:117] "RemoveContainer" containerID="a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.487343 4935 scope.go:117] "RemoveContainer" containerID="ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7" Dec 17 10:02:37 crc kubenswrapper[4935]: E1217 10:02:37.487961 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7\": container with ID starting with ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7 not found: ID does not exist" containerID="ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.487990 
4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7"} err="failed to get container status \"ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7\": rpc error: code = NotFound desc = could not find container \"ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7\": container with ID starting with ec73a5b3ace7679e47d50c66fb00a2b2eecb166703eb8fbe3c7d937a6abf5bc7 not found: ID does not exist" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.488014 4935 scope.go:117] "RemoveContainer" containerID="71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9" Dec 17 10:02:37 crc kubenswrapper[4935]: E1217 10:02:37.488534 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9\": container with ID starting with 71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9 not found: ID does not exist" containerID="71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.488577 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9"} err="failed to get container status \"71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9\": rpc error: code = NotFound desc = could not find container \"71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9\": container with ID starting with 71bf7217ed96acb0572acd5c8e5f5632de188bb142821921c89ee24c5bf2d8e9 not found: ID does not exist" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.488607 4935 scope.go:117] "RemoveContainer" containerID="a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea" Dec 17 10:02:37 crc kubenswrapper[4935]: E1217 
10:02:37.489078 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea\": container with ID starting with a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea not found: ID does not exist" containerID="a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea" Dec 17 10:02:37 crc kubenswrapper[4935]: I1217 10:02:37.489105 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea"} err="failed to get container status \"a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea\": rpc error: code = NotFound desc = could not find container \"a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea\": container with ID starting with a5c0727e4273d4d1a0ef2fb4e6e2b683b3ba87a38127ade761fc62240979bdea not found: ID does not exist" Dec 17 10:02:39 crc kubenswrapper[4935]: I1217 10:02:39.138846 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" path="/var/lib/kubelet/pods/313b0503-c44b-412b-bf47-4248f3c6e28c/volumes" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.626872 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j4lh9"] Dec 17 10:02:59 crc kubenswrapper[4935]: E1217 10:02:59.628059 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerName="registry-server" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.628078 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerName="registry-server" Dec 17 10:02:59 crc kubenswrapper[4935]: E1217 10:02:59.628103 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" 
containerName="extract-utilities" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.628112 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerName="extract-utilities" Dec 17 10:02:59 crc kubenswrapper[4935]: E1217 10:02:59.628121 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerName="extract-content" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.628129 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerName="extract-content" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.628393 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="313b0503-c44b-412b-bf47-4248f3c6e28c" containerName="registry-server" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.630082 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.635165 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4lh9"] Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.682629 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-utilities\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.682692 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfn4b\" (UniqueName: \"kubernetes.io/projected/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-kube-api-access-tfn4b\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " 
pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.682737 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-catalog-content\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.784495 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-utilities\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.784564 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfn4b\" (UniqueName: \"kubernetes.io/projected/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-kube-api-access-tfn4b\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.784613 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-catalog-content\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.785159 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-catalog-content\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " 
pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.785414 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-utilities\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.816809 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfn4b\" (UniqueName: \"kubernetes.io/projected/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-kube-api-access-tfn4b\") pod \"redhat-operators-j4lh9\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:02:59 crc kubenswrapper[4935]: I1217 10:02:59.953001 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.130428 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.130492 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.130546 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 10:03:00 crc 
kubenswrapper[4935]: I1217 10:03:00.131326 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3fedab7535056f58f455e566301bd7d5ef66bd45dfcef7177d9b174d3b301c2b"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.131401 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://3fedab7535056f58f455e566301bd7d5ef66bd45dfcef7177d9b174d3b301c2b" gracePeriod=600 Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.572294 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j4lh9"] Dec 17 10:03:00 crc kubenswrapper[4935]: W1217 10:03:00.574493 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eeedd08_da84_40ae_97ff_e19a5d3c67bd.slice/crio-7b757f689781011865a80f68108d297a56785df90fb45b7916eb64ef31448045 WatchSource:0}: Error finding container 7b757f689781011865a80f68108d297a56785df90fb45b7916eb64ef31448045: Status 404 returned error can't find the container with id 7b757f689781011865a80f68108d297a56785df90fb45b7916eb64ef31448045 Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.607926 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="3fedab7535056f58f455e566301bd7d5ef66bd45dfcef7177d9b174d3b301c2b" exitCode=0 Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.607982 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" 
event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"3fedab7535056f58f455e566301bd7d5ef66bd45dfcef7177d9b174d3b301c2b"} Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.608012 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827"} Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.608028 4935 scope.go:117] "RemoveContainer" containerID="011fa8279d513f1b635ae406b8201ba8d4a6037e9c4fc25de65ed1d04ee3f092" Dec 17 10:03:00 crc kubenswrapper[4935]: I1217 10:03:00.610872 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4lh9" event={"ID":"1eeedd08-da84-40ae-97ff-e19a5d3c67bd","Type":"ContainerStarted","Data":"7b757f689781011865a80f68108d297a56785df90fb45b7916eb64ef31448045"} Dec 17 10:03:01 crc kubenswrapper[4935]: I1217 10:03:01.623549 4935 generic.go:334] "Generic (PLEG): container finished" podID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerID="5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9" exitCode=0 Dec 17 10:03:01 crc kubenswrapper[4935]: I1217 10:03:01.623738 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4lh9" event={"ID":"1eeedd08-da84-40ae-97ff-e19a5d3c67bd","Type":"ContainerDied","Data":"5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9"} Dec 17 10:03:02 crc kubenswrapper[4935]: I1217 10:03:02.634652 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4lh9" event={"ID":"1eeedd08-da84-40ae-97ff-e19a5d3c67bd","Type":"ContainerStarted","Data":"b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f"} Dec 17 10:03:05 crc kubenswrapper[4935]: I1217 10:03:05.660351 4935 generic.go:334] "Generic (PLEG): container 
finished" podID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerID="b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f" exitCode=0 Dec 17 10:03:05 crc kubenswrapper[4935]: I1217 10:03:05.660404 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4lh9" event={"ID":"1eeedd08-da84-40ae-97ff-e19a5d3c67bd","Type":"ContainerDied","Data":"b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f"} Dec 17 10:03:06 crc kubenswrapper[4935]: I1217 10:03:06.672741 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4lh9" event={"ID":"1eeedd08-da84-40ae-97ff-e19a5d3c67bd","Type":"ContainerStarted","Data":"961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad"} Dec 17 10:03:06 crc kubenswrapper[4935]: I1217 10:03:06.702946 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j4lh9" podStartSLOduration=3.225229578 podStartE2EDuration="7.70292149s" podCreationTimestamp="2025-12-17 10:02:59 +0000 UTC" firstStartedPulling="2025-12-17 10:03:01.626937834 +0000 UTC m=+3501.286778587" lastFinishedPulling="2025-12-17 10:03:06.104629736 +0000 UTC m=+3505.764470499" observedRunningTime="2025-12-17 10:03:06.692533786 +0000 UTC m=+3506.352374549" watchObservedRunningTime="2025-12-17 10:03:06.70292149 +0000 UTC m=+3506.362762253" Dec 17 10:03:09 crc kubenswrapper[4935]: I1217 10:03:09.953424 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:03:09 crc kubenswrapper[4935]: I1217 10:03:09.955086 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:03:11 crc kubenswrapper[4935]: I1217 10:03:11.003171 4935 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j4lh9" 
podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="registry-server" probeResult="failure" output=< Dec 17 10:03:11 crc kubenswrapper[4935]: timeout: failed to connect service ":50051" within 1s Dec 17 10:03:11 crc kubenswrapper[4935]: > Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.511711 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xvjjh"] Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.514294 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.541819 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvjjh"] Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.658848 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-utilities\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.658967 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-catalog-content\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.659040 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58ps8\" (UniqueName: \"kubernetes.io/projected/32851319-3003-49cd-8be8-9a4076d0f6af-kube-api-access-58ps8\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " 
pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.761088 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-catalog-content\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.761174 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58ps8\" (UniqueName: \"kubernetes.io/projected/32851319-3003-49cd-8be8-9a4076d0f6af-kube-api-access-58ps8\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.761535 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-utilities\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.762100 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-utilities\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.762638 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-catalog-content\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " 
pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.797706 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58ps8\" (UniqueName: \"kubernetes.io/projected/32851319-3003-49cd-8be8-9a4076d0f6af-kube-api-access-58ps8\") pod \"certified-operators-xvjjh\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:18 crc kubenswrapper[4935]: I1217 10:03:18.837708 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:19 crc kubenswrapper[4935]: I1217 10:03:19.330309 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xvjjh"] Dec 17 10:03:19 crc kubenswrapper[4935]: I1217 10:03:19.798891 4935 generic.go:334] "Generic (PLEG): container finished" podID="32851319-3003-49cd-8be8-9a4076d0f6af" containerID="29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf" exitCode=0 Dec 17 10:03:19 crc kubenswrapper[4935]: I1217 10:03:19.798999 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjjh" event={"ID":"32851319-3003-49cd-8be8-9a4076d0f6af","Type":"ContainerDied","Data":"29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf"} Dec 17 10:03:19 crc kubenswrapper[4935]: I1217 10:03:19.799575 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjjh" event={"ID":"32851319-3003-49cd-8be8-9a4076d0f6af","Type":"ContainerStarted","Data":"d1f47ebf9fd6750ef8a2fe683f6591801b56e668e68e18366c4047145421ca6b"} Dec 17 10:03:20 crc kubenswrapper[4935]: I1217 10:03:20.000297 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:03:20 crc kubenswrapper[4935]: I1217 10:03:20.046431 4935 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:03:20 crc kubenswrapper[4935]: I1217 10:03:20.905340 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j4lh9"] Dec 17 10:03:21 crc kubenswrapper[4935]: I1217 10:03:21.822155 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjjh" event={"ID":"32851319-3003-49cd-8be8-9a4076d0f6af","Type":"ContainerStarted","Data":"83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9"} Dec 17 10:03:21 crc kubenswrapper[4935]: I1217 10:03:21.822382 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j4lh9" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="registry-server" containerID="cri-o://961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad" gracePeriod=2 Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.296574 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.461567 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfn4b\" (UniqueName: \"kubernetes.io/projected/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-kube-api-access-tfn4b\") pod \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.462090 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-catalog-content\") pod \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.462139 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-utilities\") pod \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\" (UID: \"1eeedd08-da84-40ae-97ff-e19a5d3c67bd\") " Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.462672 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-utilities" (OuterVolumeSpecName: "utilities") pod "1eeedd08-da84-40ae-97ff-e19a5d3c67bd" (UID: "1eeedd08-da84-40ae-97ff-e19a5d3c67bd"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.463099 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.468757 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-kube-api-access-tfn4b" (OuterVolumeSpecName: "kube-api-access-tfn4b") pod "1eeedd08-da84-40ae-97ff-e19a5d3c67bd" (UID: "1eeedd08-da84-40ae-97ff-e19a5d3c67bd"). InnerVolumeSpecName "kube-api-access-tfn4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.559395 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1eeedd08-da84-40ae-97ff-e19a5d3c67bd" (UID: "1eeedd08-da84-40ae-97ff-e19a5d3c67bd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.564985 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfn4b\" (UniqueName: \"kubernetes.io/projected/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-kube-api-access-tfn4b\") on node \"crc\" DevicePath \"\"" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.565023 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1eeedd08-da84-40ae-97ff-e19a5d3c67bd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.832919 4935 generic.go:334] "Generic (PLEG): container finished" podID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerID="961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad" exitCode=0 Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.832970 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j4lh9" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.832990 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4lh9" event={"ID":"1eeedd08-da84-40ae-97ff-e19a5d3c67bd","Type":"ContainerDied","Data":"961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad"} Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.833053 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j4lh9" event={"ID":"1eeedd08-da84-40ae-97ff-e19a5d3c67bd","Type":"ContainerDied","Data":"7b757f689781011865a80f68108d297a56785df90fb45b7916eb64ef31448045"} Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.833086 4935 scope.go:117] "RemoveContainer" containerID="961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.835417 4935 generic.go:334] "Generic (PLEG): container finished" 
podID="32851319-3003-49cd-8be8-9a4076d0f6af" containerID="83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9" exitCode=0 Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.835487 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjjh" event={"ID":"32851319-3003-49cd-8be8-9a4076d0f6af","Type":"ContainerDied","Data":"83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9"} Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.860446 4935 scope.go:117] "RemoveContainer" containerID="b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.881349 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j4lh9"] Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.892798 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j4lh9"] Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.895700 4935 scope.go:117] "RemoveContainer" containerID="5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.931366 4935 scope.go:117] "RemoveContainer" containerID="961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad" Dec 17 10:03:22 crc kubenswrapper[4935]: E1217 10:03:22.931928 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad\": container with ID starting with 961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad not found: ID does not exist" containerID="961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.931979 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad"} err="failed to get container status \"961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad\": rpc error: code = NotFound desc = could not find container \"961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad\": container with ID starting with 961904a53c8b22e9b2293eb16c1f34d0df894ef41a7624c7906021ac9e195bad not found: ID does not exist" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.932007 4935 scope.go:117] "RemoveContainer" containerID="b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f" Dec 17 10:03:22 crc kubenswrapper[4935]: E1217 10:03:22.932446 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f\": container with ID starting with b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f not found: ID does not exist" containerID="b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.932484 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f"} err="failed to get container status \"b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f\": rpc error: code = NotFound desc = could not find container \"b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f\": container with ID starting with b08d944647ad8666e1f66c143a7e7155c2d86568efae8ab95a38c4066423b26f not found: ID does not exist" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.932502 4935 scope.go:117] "RemoveContainer" containerID="5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9" Dec 17 10:03:22 crc kubenswrapper[4935]: E1217 10:03:22.932769 4935 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9\": container with ID starting with 5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9 not found: ID does not exist" containerID="5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9" Dec 17 10:03:22 crc kubenswrapper[4935]: I1217 10:03:22.932796 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9"} err="failed to get container status \"5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9\": rpc error: code = NotFound desc = could not find container \"5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9\": container with ID starting with 5ca49ec01b3a4c0fb6a519f96626180e3a9e1348e11e8e93ec20e69e0d584ab9 not found: ID does not exist" Dec 17 10:03:23 crc kubenswrapper[4935]: I1217 10:03:23.134857 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" path="/var/lib/kubelet/pods/1eeedd08-da84-40ae-97ff-e19a5d3c67bd/volumes" Dec 17 10:03:23 crc kubenswrapper[4935]: I1217 10:03:23.849444 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjjh" event={"ID":"32851319-3003-49cd-8be8-9a4076d0f6af","Type":"ContainerStarted","Data":"5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3"} Dec 17 10:03:23 crc kubenswrapper[4935]: I1217 10:03:23.875240 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xvjjh" podStartSLOduration=2.374286107 podStartE2EDuration="5.875218532s" podCreationTimestamp="2025-12-17 10:03:18 +0000 UTC" firstStartedPulling="2025-12-17 10:03:19.8012136 +0000 UTC m=+3519.461054383" lastFinishedPulling="2025-12-17 10:03:23.302146045 +0000 UTC m=+3522.961986808" 
observedRunningTime="2025-12-17 10:03:23.872918517 +0000 UTC m=+3523.532759280" watchObservedRunningTime="2025-12-17 10:03:23.875218532 +0000 UTC m=+3523.535059295" Dec 17 10:03:28 crc kubenswrapper[4935]: I1217 10:03:28.838491 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:28 crc kubenswrapper[4935]: I1217 10:03:28.840715 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:28 crc kubenswrapper[4935]: I1217 10:03:28.897832 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:28 crc kubenswrapper[4935]: I1217 10:03:28.952172 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:29 crc kubenswrapper[4935]: I1217 10:03:29.138014 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvjjh"] Dec 17 10:03:30 crc kubenswrapper[4935]: I1217 10:03:30.917625 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xvjjh" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="registry-server" containerID="cri-o://5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3" gracePeriod=2 Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.389316 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.550596 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58ps8\" (UniqueName: \"kubernetes.io/projected/32851319-3003-49cd-8be8-9a4076d0f6af-kube-api-access-58ps8\") pod \"32851319-3003-49cd-8be8-9a4076d0f6af\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.551036 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-utilities\") pod \"32851319-3003-49cd-8be8-9a4076d0f6af\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.551118 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-catalog-content\") pod \"32851319-3003-49cd-8be8-9a4076d0f6af\" (UID: \"32851319-3003-49cd-8be8-9a4076d0f6af\") " Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.553381 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-utilities" (OuterVolumeSpecName: "utilities") pod "32851319-3003-49cd-8be8-9a4076d0f6af" (UID: "32851319-3003-49cd-8be8-9a4076d0f6af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.556836 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32851319-3003-49cd-8be8-9a4076d0f6af-kube-api-access-58ps8" (OuterVolumeSpecName: "kube-api-access-58ps8") pod "32851319-3003-49cd-8be8-9a4076d0f6af" (UID: "32851319-3003-49cd-8be8-9a4076d0f6af"). InnerVolumeSpecName "kube-api-access-58ps8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.612523 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32851319-3003-49cd-8be8-9a4076d0f6af" (UID: "32851319-3003-49cd-8be8-9a4076d0f6af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.653947 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.653989 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58ps8\" (UniqueName: \"kubernetes.io/projected/32851319-3003-49cd-8be8-9a4076d0f6af-kube-api-access-58ps8\") on node \"crc\" DevicePath \"\"" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.654001 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32851319-3003-49cd-8be8-9a4076d0f6af-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.927393 4935 generic.go:334] "Generic (PLEG): container finished" podID="32851319-3003-49cd-8be8-9a4076d0f6af" containerID="5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3" exitCode=0 Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.927477 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xvjjh" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.927490 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjjh" event={"ID":"32851319-3003-49cd-8be8-9a4076d0f6af","Type":"ContainerDied","Data":"5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3"} Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.928416 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xvjjh" event={"ID":"32851319-3003-49cd-8be8-9a4076d0f6af","Type":"ContainerDied","Data":"d1f47ebf9fd6750ef8a2fe683f6591801b56e668e68e18366c4047145421ca6b"} Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.928450 4935 scope.go:117] "RemoveContainer" containerID="5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.963041 4935 scope.go:117] "RemoveContainer" containerID="83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9" Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.976070 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xvjjh"] Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.987298 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xvjjh"] Dec 17 10:03:31 crc kubenswrapper[4935]: I1217 10:03:31.996991 4935 scope.go:117] "RemoveContainer" containerID="29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf" Dec 17 10:03:32 crc kubenswrapper[4935]: I1217 10:03:32.042283 4935 scope.go:117] "RemoveContainer" containerID="5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3" Dec 17 10:03:32 crc kubenswrapper[4935]: E1217 10:03:32.042807 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3\": container with ID starting with 5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3 not found: ID does not exist" containerID="5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3" Dec 17 10:03:32 crc kubenswrapper[4935]: I1217 10:03:32.042879 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3"} err="failed to get container status \"5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3\": rpc error: code = NotFound desc = could not find container \"5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3\": container with ID starting with 5a1fc466936677e8196d16613bf0ea4f39e9e0a36d98d9869cdfa6851315f5c3 not found: ID does not exist" Dec 17 10:03:32 crc kubenswrapper[4935]: I1217 10:03:32.042924 4935 scope.go:117] "RemoveContainer" containerID="83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9" Dec 17 10:03:32 crc kubenswrapper[4935]: E1217 10:03:32.043365 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9\": container with ID starting with 83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9 not found: ID does not exist" containerID="83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9" Dec 17 10:03:32 crc kubenswrapper[4935]: I1217 10:03:32.043408 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9"} err="failed to get container status \"83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9\": rpc error: code = NotFound desc = could not find container \"83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9\": container with ID 
starting with 83586f819de8febdf0326ec7f2ff50afa7b4fac69e19c3da4f837454383bfed9 not found: ID does not exist" Dec 17 10:03:32 crc kubenswrapper[4935]: I1217 10:03:32.043444 4935 scope.go:117] "RemoveContainer" containerID="29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf" Dec 17 10:03:32 crc kubenswrapper[4935]: E1217 10:03:32.043928 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf\": container with ID starting with 29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf not found: ID does not exist" containerID="29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf" Dec 17 10:03:32 crc kubenswrapper[4935]: I1217 10:03:32.044012 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf"} err="failed to get container status \"29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf\": rpc error: code = NotFound desc = could not find container \"29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf\": container with ID starting with 29e27262ff76f9cebbcf53e7378d17e56424fd0bc0c3e636c1e92fe8a2fb3eaf not found: ID does not exist" Dec 17 10:03:33 crc kubenswrapper[4935]: I1217 10:03:33.135089 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" path="/var/lib/kubelet/pods/32851319-3003-49cd-8be8-9a4076d0f6af/volumes" Dec 17 10:05:00 crc kubenswrapper[4935]: I1217 10:05:00.130747 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:05:00 crc kubenswrapper[4935]: I1217 
10:05:00.131743 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:05:26 crc kubenswrapper[4935]: I1217 10:05:26.954176 4935 generic.go:334] "Generic (PLEG): container finished" podID="36307e10-5953-420e-9627-2812d493abea" containerID="bd7ced6a6394544aca77363a42f4708768ece2ecce1846445befd480020521ee" exitCode=0 Dec 17 10:05:26 crc kubenswrapper[4935]: I1217 10:05:26.954287 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"36307e10-5953-420e-9627-2812d493abea","Type":"ContainerDied","Data":"bd7ced6a6394544aca77363a42f4708768ece2ecce1846445befd480020521ee"} Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.316064 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.388681 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ca-certs\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389141 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-config-data\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389165 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-workdir\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389222 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389264 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-temporary\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389312 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-qgdww\" (UniqueName: \"kubernetes.io/projected/36307e10-5953-420e-9627-2812d493abea-kube-api-access-qgdww\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389359 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-openstack-config\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389446 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-openstack-config-secret\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389479 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ssh-key\") pod \"36307e10-5953-420e-9627-2812d493abea\" (UID: \"36307e10-5953-420e-9627-2812d493abea\") " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.389999 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.390122 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-config-data" (OuterVolumeSpecName: "config-data") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.394115 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.398647 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36307e10-5953-420e-9627-2812d493abea-kube-api-access-qgdww" (OuterVolumeSpecName: "kube-api-access-qgdww") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "kube-api-access-qgdww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.398646 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.418963 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.420008 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.420595 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.447978 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "36307e10-5953-420e-9627-2812d493abea" (UID: "36307e10-5953-420e-9627-2812d493abea"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491383 4935 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491414 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgdww\" (UniqueName: \"kubernetes.io/projected/36307e10-5953-420e-9627-2812d493abea-kube-api-access-qgdww\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491424 4935 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491433 4935 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491463 4935 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491472 4935 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/36307e10-5953-420e-9627-2812d493abea-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491479 4935 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36307e10-5953-420e-9627-2812d493abea-config-data\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc 
kubenswrapper[4935]: I1217 10:05:28.491488 4935 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/36307e10-5953-420e-9627-2812d493abea-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.491540 4935 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.535004 4935 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.593204 4935 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.973377 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"36307e10-5953-420e-9627-2812d493abea","Type":"ContainerDied","Data":"1eb739047a68985fcf51bb544f07915f6d31552bc129d769c10603983ff0c231"} Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.973443 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 17 10:05:28 crc kubenswrapper[4935]: I1217 10:05:28.973432 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1eb739047a68985fcf51bb544f07915f6d31552bc129d769c10603983ff0c231" Dec 17 10:05:30 crc kubenswrapper[4935]: I1217 10:05:30.131479 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:05:30 crc kubenswrapper[4935]: I1217 10:05:30.131567 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.828888 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 17 10:05:37 crc kubenswrapper[4935]: E1217 10:05:37.830053 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="extract-utilities" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830066 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="extract-utilities" Dec 17 10:05:37 crc kubenswrapper[4935]: E1217 10:05:37.830078 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="registry-server" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830083 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" 
containerName="registry-server" Dec 17 10:05:37 crc kubenswrapper[4935]: E1217 10:05:37.830092 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="extract-content" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830098 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="extract-content" Dec 17 10:05:37 crc kubenswrapper[4935]: E1217 10:05:37.830124 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="extract-content" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830130 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="extract-content" Dec 17 10:05:37 crc kubenswrapper[4935]: E1217 10:05:37.830142 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="registry-server" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830148 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="registry-server" Dec 17 10:05:37 crc kubenswrapper[4935]: E1217 10:05:37.830158 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="extract-utilities" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830164 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="extract-utilities" Dec 17 10:05:37 crc kubenswrapper[4935]: E1217 10:05:37.830178 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36307e10-5953-420e-9627-2812d493abea" containerName="tempest-tests-tempest-tests-runner" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830184 4935 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36307e10-5953-420e-9627-2812d493abea" containerName="tempest-tests-tempest-tests-runner" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830410 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="36307e10-5953-420e-9627-2812d493abea" containerName="tempest-tests-tempest-tests-runner" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830427 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="32851319-3003-49cd-8be8-9a4076d0f6af" containerName="registry-server" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.830439 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eeedd08-da84-40ae-97ff-e19a5d3c67bd" containerName="registry-server" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.831067 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.833798 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-45q57" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.838166 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.868107 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t99r6\" (UniqueName: \"kubernetes.io/projected/2575fc30-0353-4349-9f66-5915130b3e06-kube-api-access-t99r6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2575fc30-0353-4349-9f66-5915130b3e06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.868259 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2575fc30-0353-4349-9f66-5915130b3e06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.969938 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2575fc30-0353-4349-9f66-5915130b3e06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.970058 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t99r6\" (UniqueName: \"kubernetes.io/projected/2575fc30-0353-4349-9f66-5915130b3e06-kube-api-access-t99r6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2575fc30-0353-4349-9f66-5915130b3e06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:37 crc kubenswrapper[4935]: I1217 10:05:37.970453 4935 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2575fc30-0353-4349-9f66-5915130b3e06\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:38 crc kubenswrapper[4935]: I1217 10:05:38.001365 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t99r6\" (UniqueName: \"kubernetes.io/projected/2575fc30-0353-4349-9f66-5915130b3e06-kube-api-access-t99r6\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2575fc30-0353-4349-9f66-5915130b3e06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 
10:05:38 crc kubenswrapper[4935]: I1217 10:05:38.004238 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"2575fc30-0353-4349-9f66-5915130b3e06\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:38 crc kubenswrapper[4935]: I1217 10:05:38.152842 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 17 10:05:38 crc kubenswrapper[4935]: I1217 10:05:38.590237 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 17 10:05:39 crc kubenswrapper[4935]: I1217 10:05:39.068259 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2575fc30-0353-4349-9f66-5915130b3e06","Type":"ContainerStarted","Data":"dc5369e881e75fa94e188d6853ca5f5743aeecf1c4cf9cfc2f8a3a54eaa4312b"} Dec 17 10:05:40 crc kubenswrapper[4935]: I1217 10:05:40.078855 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"2575fc30-0353-4349-9f66-5915130b3e06","Type":"ContainerStarted","Data":"bef53642cba8adefaabc6685f5e80d690912070f96ffad8f41c4d73feb14e89e"} Dec 17 10:05:40 crc kubenswrapper[4935]: I1217 10:05:40.092976 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.265316266 podStartE2EDuration="3.092957074s" podCreationTimestamp="2025-12-17 10:05:37 +0000 UTC" firstStartedPulling="2025-12-17 10:05:38.597351596 +0000 UTC m=+3658.257192359" lastFinishedPulling="2025-12-17 10:05:39.424992404 +0000 UTC m=+3659.084833167" observedRunningTime="2025-12-17 
10:05:40.090386101 +0000 UTC m=+3659.750226874" watchObservedRunningTime="2025-12-17 10:05:40.092957074 +0000 UTC m=+3659.752797847" Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.130132 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.130881 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.130915 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.131389 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.131443 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" gracePeriod=600 Dec 17 10:06:00 crc kubenswrapper[4935]: E1217 10:06:00.255264 4935 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.271707 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" exitCode=0 Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.271795 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827"} Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.271923 4935 scope.go:117] "RemoveContainer" containerID="3fedab7535056f58f455e566301bd7d5ef66bd45dfcef7177d9b174d3b301c2b" Dec 17 10:06:00 crc kubenswrapper[4935]: I1217 10:06:00.272930 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:06:00 crc kubenswrapper[4935]: E1217 10:06:00.273318 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.766781 4935 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-4vtt6/must-gather-tcjhj"] Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.768889 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.770265 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4vtt6"/"default-dockercfg-969h5" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.770635 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4vtt6"/"openshift-service-ca.crt" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.774659 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4vtt6"/"kube-root-ca.crt" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.778439 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4vtt6/must-gather-tcjhj"] Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.865155 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c52c\" (UniqueName: \"kubernetes.io/projected/3b1c768d-12e1-431d-90c5-fad6b0d5be56-kube-api-access-8c52c\") pod \"must-gather-tcjhj\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.865209 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b1c768d-12e1-431d-90c5-fad6b0d5be56-must-gather-output\") pod \"must-gather-tcjhj\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.966967 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c52c\" (UniqueName: 
\"kubernetes.io/projected/3b1c768d-12e1-431d-90c5-fad6b0d5be56-kube-api-access-8c52c\") pod \"must-gather-tcjhj\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.967454 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b1c768d-12e1-431d-90c5-fad6b0d5be56-must-gather-output\") pod \"must-gather-tcjhj\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.968014 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b1c768d-12e1-431d-90c5-fad6b0d5be56-must-gather-output\") pod \"must-gather-tcjhj\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:02 crc kubenswrapper[4935]: I1217 10:06:02.990289 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c52c\" (UniqueName: \"kubernetes.io/projected/3b1c768d-12e1-431d-90c5-fad6b0d5be56-kube-api-access-8c52c\") pod \"must-gather-tcjhj\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:03 crc kubenswrapper[4935]: I1217 10:06:03.088579 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:06:03 crc kubenswrapper[4935]: I1217 10:06:03.576357 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4vtt6/must-gather-tcjhj"] Dec 17 10:06:04 crc kubenswrapper[4935]: I1217 10:06:04.325205 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" event={"ID":"3b1c768d-12e1-431d-90c5-fad6b0d5be56","Type":"ContainerStarted","Data":"8c698254f7cb5b92b7fe50ae49e0cd9fcac58f18706a3368a8f07364684431a9"} Dec 17 10:06:10 crc kubenswrapper[4935]: I1217 10:06:10.396752 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" event={"ID":"3b1c768d-12e1-431d-90c5-fad6b0d5be56","Type":"ContainerStarted","Data":"0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec"} Dec 17 10:06:10 crc kubenswrapper[4935]: I1217 10:06:10.397429 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" event={"ID":"3b1c768d-12e1-431d-90c5-fad6b0d5be56","Type":"ContainerStarted","Data":"bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd"} Dec 17 10:06:10 crc kubenswrapper[4935]: I1217 10:06:10.416015 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" podStartSLOduration=2.603454884 podStartE2EDuration="8.41599164s" podCreationTimestamp="2025-12-17 10:06:02 +0000 UTC" firstStartedPulling="2025-12-17 10:06:03.57989627 +0000 UTC m=+3683.239737033" lastFinishedPulling="2025-12-17 10:06:09.392433026 +0000 UTC m=+3689.052273789" observedRunningTime="2025-12-17 10:06:10.411790597 +0000 UTC m=+3690.071631360" watchObservedRunningTime="2025-12-17 10:06:10.41599164 +0000 UTC m=+3690.075832403" Dec 17 10:06:11 crc kubenswrapper[4935]: I1217 10:06:11.130613 4935 scope.go:117] "RemoveContainer" 
containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:06:11 crc kubenswrapper[4935]: E1217 10:06:11.131265 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.090213 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-rg972"] Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.091953 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.098189 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqmrz\" (UniqueName: \"kubernetes.io/projected/a11021d4-e311-43fd-b0c6-0bcf41714125-kube-api-access-vqmrz\") pod \"crc-debug-rg972\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.098593 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a11021d4-e311-43fd-b0c6-0bcf41714125-host\") pod \"crc-debug-rg972\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.200188 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqmrz\" (UniqueName: \"kubernetes.io/projected/a11021d4-e311-43fd-b0c6-0bcf41714125-kube-api-access-vqmrz\") pod 
\"crc-debug-rg972\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.200265 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a11021d4-e311-43fd-b0c6-0bcf41714125-host\") pod \"crc-debug-rg972\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.200357 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a11021d4-e311-43fd-b0c6-0bcf41714125-host\") pod \"crc-debug-rg972\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.223939 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqmrz\" (UniqueName: \"kubernetes.io/projected/a11021d4-e311-43fd-b0c6-0bcf41714125-kube-api-access-vqmrz\") pod \"crc-debug-rg972\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:13 crc kubenswrapper[4935]: I1217 10:06:13.409020 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:06:14 crc kubenswrapper[4935]: I1217 10:06:14.451788 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/crc-debug-rg972" event={"ID":"a11021d4-e311-43fd-b0c6-0bcf41714125","Type":"ContainerStarted","Data":"fd95fe867f64f72c9242cf6d3a91c4583c639b0f8794f44e042858c314af56c7"} Dec 17 10:06:22 crc kubenswrapper[4935]: I1217 10:06:22.124311 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:06:22 crc kubenswrapper[4935]: E1217 10:06:22.125251 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.558803 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kwdhc"] Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.561609 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.571119 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwdhc"] Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.697648 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-catalog-content\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.697739 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-utilities\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.697779 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9jhv\" (UniqueName: \"kubernetes.io/projected/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-kube-api-access-x9jhv\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.799459 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-catalog-content\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.799525 4935 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-utilities\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.799549 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9jhv\" (UniqueName: \"kubernetes.io/projected/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-kube-api-access-x9jhv\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.799871 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-catalog-content\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.800087 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-utilities\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.835559 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9jhv\" (UniqueName: \"kubernetes.io/projected/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-kube-api-access-x9jhv\") pod \"redhat-marketplace-kwdhc\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:28 crc kubenswrapper[4935]: I1217 10:06:28.885193 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:29 crc kubenswrapper[4935]: E1217 10:06:29.247994 4935 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Dec 17 10:06:29 crc kubenswrapper[4935]: E1217 10:06:29.248532 4935 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqmrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-rg972_openshift-must-gather-4vtt6(a11021d4-e311-43fd-b0c6-0bcf41714125): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 17 10:06:29 crc kubenswrapper[4935]: E1217 10:06:29.249705 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-4vtt6/crc-debug-rg972" podUID="a11021d4-e311-43fd-b0c6-0bcf41714125" Dec 17 10:06:29 crc kubenswrapper[4935]: E1217 10:06:29.590532 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-4vtt6/crc-debug-rg972" podUID="a11021d4-e311-43fd-b0c6-0bcf41714125" Dec 17 10:06:29 crc kubenswrapper[4935]: I1217 10:06:29.658078 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwdhc"] Dec 17 10:06:30 crc kubenswrapper[4935]: I1217 10:06:30.598825 4935 generic.go:334] "Generic (PLEG): container finished" podID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerID="30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4" exitCode=0 Dec 17 10:06:30 crc kubenswrapper[4935]: I1217 10:06:30.598893 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwdhc" event={"ID":"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6","Type":"ContainerDied","Data":"30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4"} Dec 17 10:06:30 crc kubenswrapper[4935]: I1217 10:06:30.599110 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwdhc" event={"ID":"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6","Type":"ContainerStarted","Data":"c9fd74b429270bb3816c1595997410ad3c8fcfcf5a517cecacf5bd29479fe070"} Dec 17 10:06:31 crc kubenswrapper[4935]: I1217 10:06:31.611136 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwdhc" event={"ID":"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6","Type":"ContainerStarted","Data":"51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894"} Dec 17 10:06:32 crc kubenswrapper[4935]: I1217 10:06:32.622591 4935 generic.go:334] "Generic (PLEG): container finished" podID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerID="51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894" exitCode=0 Dec 17 10:06:32 crc kubenswrapper[4935]: I1217 10:06:32.622675 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-kwdhc" event={"ID":"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6","Type":"ContainerDied","Data":"51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894"} Dec 17 10:06:32 crc kubenswrapper[4935]: I1217 10:06:32.622979 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwdhc" event={"ID":"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6","Type":"ContainerStarted","Data":"b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740"} Dec 17 10:06:32 crc kubenswrapper[4935]: I1217 10:06:32.645363 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kwdhc" podStartSLOduration=3.210523044 podStartE2EDuration="4.645346815s" podCreationTimestamp="2025-12-17 10:06:28 +0000 UTC" firstStartedPulling="2025-12-17 10:06:30.600558104 +0000 UTC m=+3710.260398877" lastFinishedPulling="2025-12-17 10:06:32.035381875 +0000 UTC m=+3711.695222648" observedRunningTime="2025-12-17 10:06:32.641234644 +0000 UTC m=+3712.301075407" watchObservedRunningTime="2025-12-17 10:06:32.645346815 +0000 UTC m=+3712.305187578" Dec 17 10:06:37 crc kubenswrapper[4935]: I1217 10:06:37.124858 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:06:37 crc kubenswrapper[4935]: E1217 10:06:37.125666 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:06:38 crc kubenswrapper[4935]: I1217 10:06:38.886385 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:38 crc kubenswrapper[4935]: I1217 10:06:38.886880 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:38 crc kubenswrapper[4935]: I1217 10:06:38.936263 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:39 crc kubenswrapper[4935]: I1217 10:06:39.744672 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:39 crc kubenswrapper[4935]: I1217 10:06:39.807284 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwdhc"] Dec 17 10:06:41 crc kubenswrapper[4935]: I1217 10:06:41.700571 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kwdhc" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerName="registry-server" containerID="cri-o://b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740" gracePeriod=2 Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.232080 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.356532 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-catalog-content\") pod \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.356626 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9jhv\" (UniqueName: \"kubernetes.io/projected/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-kube-api-access-x9jhv\") pod \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.356742 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-utilities\") pod \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\" (UID: \"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6\") " Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.357538 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-utilities" (OuterVolumeSpecName: "utilities") pod "c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" (UID: "c1dee1c8-a727-439b-8ca0-ab91f1ed28c6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.365466 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-kube-api-access-x9jhv" (OuterVolumeSpecName: "kube-api-access-x9jhv") pod "c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" (UID: "c1dee1c8-a727-439b-8ca0-ab91f1ed28c6"). InnerVolumeSpecName "kube-api-access-x9jhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.381722 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" (UID: "c1dee1c8-a727-439b-8ca0-ab91f1ed28c6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.462560 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.462613 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9jhv\" (UniqueName: \"kubernetes.io/projected/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-kube-api-access-x9jhv\") on node \"crc\" DevicePath \"\"" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.462630 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.715861 4935 generic.go:334] "Generic (PLEG): container finished" podID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerID="b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740" exitCode=0 Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.716000 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kwdhc" event={"ID":"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6","Type":"ContainerDied","Data":"b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740"} Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.716392 4935 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-kwdhc" event={"ID":"c1dee1c8-a727-439b-8ca0-ab91f1ed28c6","Type":"ContainerDied","Data":"c9fd74b429270bb3816c1595997410ad3c8fcfcf5a517cecacf5bd29479fe070"} Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.716158 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kwdhc" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.716426 4935 scope.go:117] "RemoveContainer" containerID="b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.743572 4935 scope.go:117] "RemoveContainer" containerID="51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.754040 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwdhc"] Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.762000 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kwdhc"] Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.791351 4935 scope.go:117] "RemoveContainer" containerID="30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.821082 4935 scope.go:117] "RemoveContainer" containerID="b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740" Dec 17 10:06:42 crc kubenswrapper[4935]: E1217 10:06:42.821662 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740\": container with ID starting with b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740 not found: ID does not exist" containerID="b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.821720 4935 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740"} err="failed to get container status \"b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740\": rpc error: code = NotFound desc = could not find container \"b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740\": container with ID starting with b4a6b88f68a93378d60f60ec0bd1de5a6a8618b189f8f83efe3f865ab1e8a740 not found: ID does not exist" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.821748 4935 scope.go:117] "RemoveContainer" containerID="51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894" Dec 17 10:06:42 crc kubenswrapper[4935]: E1217 10:06:42.822305 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894\": container with ID starting with 51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894 not found: ID does not exist" containerID="51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.822341 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894"} err="failed to get container status \"51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894\": rpc error: code = NotFound desc = could not find container \"51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894\": container with ID starting with 51269f6ff61dbcc1f8e8d50e2a2df82ed6ec2df025f13c53db8e3dd713fe9894 not found: ID does not exist" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.822365 4935 scope.go:117] "RemoveContainer" containerID="30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4" Dec 17 10:06:42 crc kubenswrapper[4935]: E1217 
10:06:42.822676 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4\": container with ID starting with 30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4 not found: ID does not exist" containerID="30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4" Dec 17 10:06:42 crc kubenswrapper[4935]: I1217 10:06:42.822719 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4"} err="failed to get container status \"30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4\": rpc error: code = NotFound desc = could not find container \"30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4\": container with ID starting with 30ffd53d3fb168e143219037ff5fd9d7fda876da5a6620cc3a4ff6c280f2c8a4 not found: ID does not exist" Dec 17 10:06:43 crc kubenswrapper[4935]: I1217 10:06:43.139194 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" path="/var/lib/kubelet/pods/c1dee1c8-a727-439b-8ca0-ab91f1ed28c6/volumes" Dec 17 10:06:44 crc kubenswrapper[4935]: I1217 10:06:44.737329 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/crc-debug-rg972" event={"ID":"a11021d4-e311-43fd-b0c6-0bcf41714125","Type":"ContainerStarted","Data":"1dc21379b7fc59826e5233126b237bda928766bfeeeabc920fea4a431002700d"} Dec 17 10:06:44 crc kubenswrapper[4935]: I1217 10:06:44.758485 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4vtt6/crc-debug-rg972" podStartSLOduration=1.435029667 podStartE2EDuration="31.758462101s" podCreationTimestamp="2025-12-17 10:06:13 +0000 UTC" firstStartedPulling="2025-12-17 10:06:13.457748696 +0000 UTC m=+3693.117589459" lastFinishedPulling="2025-12-17 
10:06:43.78118113 +0000 UTC m=+3723.441021893" observedRunningTime="2025-12-17 10:06:44.75069029 +0000 UTC m=+3724.410531053" watchObservedRunningTime="2025-12-17 10:06:44.758462101 +0000 UTC m=+3724.418302864" Dec 17 10:06:52 crc kubenswrapper[4935]: I1217 10:06:52.124635 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:06:52 crc kubenswrapper[4935]: E1217 10:06:52.125630 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:07:03 crc kubenswrapper[4935]: I1217 10:07:03.125042 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:07:03 crc kubenswrapper[4935]: E1217 10:07:03.125912 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:07:16 crc kubenswrapper[4935]: I1217 10:07:16.124344 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:07:16 crc kubenswrapper[4935]: E1217 10:07:16.125200 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:07:22 crc kubenswrapper[4935]: I1217 10:07:22.078728 4935 generic.go:334] "Generic (PLEG): container finished" podID="a11021d4-e311-43fd-b0c6-0bcf41714125" containerID="1dc21379b7fc59826e5233126b237bda928766bfeeeabc920fea4a431002700d" exitCode=0 Dec 17 10:07:22 crc kubenswrapper[4935]: I1217 10:07:22.078793 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/crc-debug-rg972" event={"ID":"a11021d4-e311-43fd-b0c6-0bcf41714125","Type":"ContainerDied","Data":"1dc21379b7fc59826e5233126b237bda928766bfeeeabc920fea4a431002700d"} Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.190351 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.229666 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-rg972"] Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.238025 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-rg972"] Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.305155 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqmrz\" (UniqueName: \"kubernetes.io/projected/a11021d4-e311-43fd-b0c6-0bcf41714125-kube-api-access-vqmrz\") pod \"a11021d4-e311-43fd-b0c6-0bcf41714125\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.305327 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a11021d4-e311-43fd-b0c6-0bcf41714125-host\") pod 
\"a11021d4-e311-43fd-b0c6-0bcf41714125\" (UID: \"a11021d4-e311-43fd-b0c6-0bcf41714125\") " Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.305389 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a11021d4-e311-43fd-b0c6-0bcf41714125-host" (OuterVolumeSpecName: "host") pod "a11021d4-e311-43fd-b0c6-0bcf41714125" (UID: "a11021d4-e311-43fd-b0c6-0bcf41714125"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.305940 4935 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a11021d4-e311-43fd-b0c6-0bcf41714125-host\") on node \"crc\" DevicePath \"\"" Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.310605 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a11021d4-e311-43fd-b0c6-0bcf41714125-kube-api-access-vqmrz" (OuterVolumeSpecName: "kube-api-access-vqmrz") pod "a11021d4-e311-43fd-b0c6-0bcf41714125" (UID: "a11021d4-e311-43fd-b0c6-0bcf41714125"). InnerVolumeSpecName "kube-api-access-vqmrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:07:23 crc kubenswrapper[4935]: I1217 10:07:23.407498 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqmrz\" (UniqueName: \"kubernetes.io/projected/a11021d4-e311-43fd-b0c6-0bcf41714125-kube-api-access-vqmrz\") on node \"crc\" DevicePath \"\"" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.098780 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd95fe867f64f72c9242cf6d3a91c4583c639b0f8794f44e042858c314af56c7" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.098884 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-rg972" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.398741 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-jf8jn"] Dec 17 10:07:24 crc kubenswrapper[4935]: E1217 10:07:24.400144 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerName="extract-utilities" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.400163 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerName="extract-utilities" Dec 17 10:07:24 crc kubenswrapper[4935]: E1217 10:07:24.400194 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerName="extract-content" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.400203 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerName="extract-content" Dec 17 10:07:24 crc kubenswrapper[4935]: E1217 10:07:24.400217 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerName="registry-server" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.400225 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" containerName="registry-server" Dec 17 10:07:24 crc kubenswrapper[4935]: E1217 10:07:24.400239 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a11021d4-e311-43fd-b0c6-0bcf41714125" containerName="container-00" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.400247 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="a11021d4-e311-43fd-b0c6-0bcf41714125" containerName="container-00" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.401081 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1dee1c8-a727-439b-8ca0-ab91f1ed28c6" 
containerName="registry-server" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.401105 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="a11021d4-e311-43fd-b0c6-0bcf41714125" containerName="container-00" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.401868 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.525391 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2j7k\" (UniqueName: \"kubernetes.io/projected/751e0917-d3a0-41a2-abad-1b8420fe1d67-kube-api-access-q2j7k\") pod \"crc-debug-jf8jn\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.525540 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/751e0917-d3a0-41a2-abad-1b8420fe1d67-host\") pod \"crc-debug-jf8jn\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.626574 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/751e0917-d3a0-41a2-abad-1b8420fe1d67-host\") pod \"crc-debug-jf8jn\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.626687 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2j7k\" (UniqueName: \"kubernetes.io/projected/751e0917-d3a0-41a2-abad-1b8420fe1d67-kube-api-access-q2j7k\") pod \"crc-debug-jf8jn\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:24 crc 
kubenswrapper[4935]: I1217 10:07:24.626696 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/751e0917-d3a0-41a2-abad-1b8420fe1d67-host\") pod \"crc-debug-jf8jn\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.647223 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2j7k\" (UniqueName: \"kubernetes.io/projected/751e0917-d3a0-41a2-abad-1b8420fe1d67-kube-api-access-q2j7k\") pod \"crc-debug-jf8jn\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:24 crc kubenswrapper[4935]: I1217 10:07:24.731585 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:25 crc kubenswrapper[4935]: I1217 10:07:25.108639 4935 generic.go:334] "Generic (PLEG): container finished" podID="751e0917-d3a0-41a2-abad-1b8420fe1d67" containerID="e8b93556375b736570e91ee846efb42b2cdc7d7c90a914897a6287e5594d7e34" exitCode=0 Dec 17 10:07:25 crc kubenswrapper[4935]: I1217 10:07:25.108686 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" event={"ID":"751e0917-d3a0-41a2-abad-1b8420fe1d67","Type":"ContainerDied","Data":"e8b93556375b736570e91ee846efb42b2cdc7d7c90a914897a6287e5594d7e34"} Dec 17 10:07:25 crc kubenswrapper[4935]: I1217 10:07:25.109351 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" event={"ID":"751e0917-d3a0-41a2-abad-1b8420fe1d67","Type":"ContainerStarted","Data":"babb7994220768bc536f8e98613655b01797dc8306fdb13540555f35daee8fb6"} Dec 17 10:07:25 crc kubenswrapper[4935]: I1217 10:07:25.144436 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a11021d4-e311-43fd-b0c6-0bcf41714125" 
path="/var/lib/kubelet/pods/a11021d4-e311-43fd-b0c6-0bcf41714125/volumes" Dec 17 10:07:25 crc kubenswrapper[4935]: I1217 10:07:25.597682 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-jf8jn"] Dec 17 10:07:25 crc kubenswrapper[4935]: I1217 10:07:25.605616 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-jf8jn"] Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.245718 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.359433 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/751e0917-d3a0-41a2-abad-1b8420fe1d67-host\") pod \"751e0917-d3a0-41a2-abad-1b8420fe1d67\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.359547 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2j7k\" (UniqueName: \"kubernetes.io/projected/751e0917-d3a0-41a2-abad-1b8420fe1d67-kube-api-access-q2j7k\") pod \"751e0917-d3a0-41a2-abad-1b8420fe1d67\" (UID: \"751e0917-d3a0-41a2-abad-1b8420fe1d67\") " Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.359568 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/751e0917-d3a0-41a2-abad-1b8420fe1d67-host" (OuterVolumeSpecName: "host") pod "751e0917-d3a0-41a2-abad-1b8420fe1d67" (UID: "751e0917-d3a0-41a2-abad-1b8420fe1d67"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.360256 4935 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/751e0917-d3a0-41a2-abad-1b8420fe1d67-host\") on node \"crc\" DevicePath \"\"" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.365228 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751e0917-d3a0-41a2-abad-1b8420fe1d67-kube-api-access-q2j7k" (OuterVolumeSpecName: "kube-api-access-q2j7k") pod "751e0917-d3a0-41a2-abad-1b8420fe1d67" (UID: "751e0917-d3a0-41a2-abad-1b8420fe1d67"). InnerVolumeSpecName "kube-api-access-q2j7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.462344 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2j7k\" (UniqueName: \"kubernetes.io/projected/751e0917-d3a0-41a2-abad-1b8420fe1d67-kube-api-access-q2j7k\") on node \"crc\" DevicePath \"\"" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.756725 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-r99dp"] Dec 17 10:07:26 crc kubenswrapper[4935]: E1217 10:07:26.757118 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751e0917-d3a0-41a2-abad-1b8420fe1d67" containerName="container-00" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.757131 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="751e0917-d3a0-41a2-abad-1b8420fe1d67" containerName="container-00" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.757346 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="751e0917-d3a0-41a2-abad-1b8420fe1d67" containerName="container-00" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.758096 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.870248 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-host\") pod \"crc-debug-r99dp\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.870564 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjrbw\" (UniqueName: \"kubernetes.io/projected/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-kube-api-access-bjrbw\") pod \"crc-debug-r99dp\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.972695 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-host\") pod \"crc-debug-r99dp\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.972797 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjrbw\" (UniqueName: \"kubernetes.io/projected/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-kube-api-access-bjrbw\") pod \"crc-debug-r99dp\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:26 crc kubenswrapper[4935]: I1217 10:07:26.972808 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-host\") pod \"crc-debug-r99dp\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:26 crc 
kubenswrapper[4935]: I1217 10:07:26.991224 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjrbw\" (UniqueName: \"kubernetes.io/projected/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-kube-api-access-bjrbw\") pod \"crc-debug-r99dp\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:27 crc kubenswrapper[4935]: I1217 10:07:27.077391 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:27 crc kubenswrapper[4935]: I1217 10:07:27.134949 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:07:27 crc kubenswrapper[4935]: I1217 10:07:27.138395 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751e0917-d3a0-41a2-abad-1b8420fe1d67" path="/var/lib/kubelet/pods/751e0917-d3a0-41a2-abad-1b8420fe1d67/volumes" Dec 17 10:07:27 crc kubenswrapper[4935]: I1217 10:07:27.139054 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/crc-debug-r99dp" event={"ID":"6ec60619-4bc2-4429-8ab6-d651ba11c0e0","Type":"ContainerStarted","Data":"32d897bec9e90cb93335175cf65e3d8d50a705d2a9b5bc68e522d1fbf53977eb"} Dec 17 10:07:27 crc kubenswrapper[4935]: I1217 10:07:27.139090 4935 scope.go:117] "RemoveContainer" containerID="e8b93556375b736570e91ee846efb42b2cdc7d7c90a914897a6287e5594d7e34" Dec 17 10:07:28 crc kubenswrapper[4935]: I1217 10:07:28.146938 4935 generic.go:334] "Generic (PLEG): container finished" podID="6ec60619-4bc2-4429-8ab6-d651ba11c0e0" containerID="3379cb07b987940a4add0f77365479473aa58cb217e2b7b0afecd61420c4aa96" exitCode=0 Dec 17 10:07:28 crc kubenswrapper[4935]: I1217 10:07:28.147053 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/crc-debug-r99dp" 
event={"ID":"6ec60619-4bc2-4429-8ab6-d651ba11c0e0","Type":"ContainerDied","Data":"3379cb07b987940a4add0f77365479473aa58cb217e2b7b0afecd61420c4aa96"} Dec 17 10:07:28 crc kubenswrapper[4935]: I1217 10:07:28.194637 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-r99dp"] Dec 17 10:07:28 crc kubenswrapper[4935]: I1217 10:07:28.205334 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vtt6/crc-debug-r99dp"] Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.124466 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:07:29 crc kubenswrapper[4935]: E1217 10:07:29.124740 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.304541 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.419726 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjrbw\" (UniqueName: \"kubernetes.io/projected/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-kube-api-access-bjrbw\") pod \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.420442 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-host\") pod \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\" (UID: \"6ec60619-4bc2-4429-8ab6-d651ba11c0e0\") " Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.421012 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-host" (OuterVolumeSpecName: "host") pod "6ec60619-4bc2-4429-8ab6-d651ba11c0e0" (UID: "6ec60619-4bc2-4429-8ab6-d651ba11c0e0"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.426577 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-kube-api-access-bjrbw" (OuterVolumeSpecName: "kube-api-access-bjrbw") pod "6ec60619-4bc2-4429-8ab6-d651ba11c0e0" (UID: "6ec60619-4bc2-4429-8ab6-d651ba11c0e0"). InnerVolumeSpecName "kube-api-access-bjrbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.522663 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjrbw\" (UniqueName: \"kubernetes.io/projected/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-kube-api-access-bjrbw\") on node \"crc\" DevicePath \"\"" Dec 17 10:07:29 crc kubenswrapper[4935]: I1217 10:07:29.522712 4935 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ec60619-4bc2-4429-8ab6-d651ba11c0e0-host\") on node \"crc\" DevicePath \"\"" Dec 17 10:07:30 crc kubenswrapper[4935]: I1217 10:07:30.178315 4935 scope.go:117] "RemoveContainer" containerID="3379cb07b987940a4add0f77365479473aa58cb217e2b7b0afecd61420c4aa96" Dec 17 10:07:30 crc kubenswrapper[4935]: I1217 10:07:30.178645 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-r99dp" Dec 17 10:07:31 crc kubenswrapper[4935]: I1217 10:07:31.136430 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec60619-4bc2-4429-8ab6-d651ba11c0e0" path="/var/lib/kubelet/pods/6ec60619-4bc2-4429-8ab6-d651ba11c0e0/volumes" Dec 17 10:07:41 crc kubenswrapper[4935]: I1217 10:07:41.131667 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:07:41 crc kubenswrapper[4935]: E1217 10:07:41.132435 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:07:43 crc kubenswrapper[4935]: I1217 10:07:43.893199 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-5b5f4b44fb-ktsl4_e4ef2a77-ca25-495d-a00c-f15993955019/barbican-api/0.log" Dec 17 10:07:44 crc kubenswrapper[4935]: I1217 10:07:44.705057 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b5f4b44fb-ktsl4_e4ef2a77-ca25-495d-a00c-f15993955019/barbican-api-log/0.log" Dec 17 10:07:44 crc kubenswrapper[4935]: I1217 10:07:44.727342 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f4bb4f7d-ntfxs_4798a48d-9fdb-40d8-9890-595874a05215/barbican-keystone-listener/0.log" Dec 17 10:07:44 crc kubenswrapper[4935]: I1217 10:07:44.812173 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f4bb4f7d-ntfxs_4798a48d-9fdb-40d8-9890-595874a05215/barbican-keystone-listener-log/0.log" Dec 17 10:07:44 crc kubenswrapper[4935]: I1217 10:07:44.914003 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69fff79bd9-55nbl_93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863/barbican-worker/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.004664 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69fff79bd9-55nbl_93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863/barbican-worker-log/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.135993 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz_1b3c1c73-3f87-4383-9d09-1931001f0629/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.252194 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/ceilometer-central-agent/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.252505 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/ceilometer-notification-agent/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.428487 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/proxy-httpd/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.498048 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/sg-core/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.550598 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f6a66562-ec0b-4302-8e76-a4567917d90a/cinder-api/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.623020 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f6a66562-ec0b-4302-8e76-a4567917d90a/cinder-api-log/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.770805 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7fb81e79-940d-4ba5-a10d-c22dca5377e0/cinder-scheduler/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.784641 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7fb81e79-940d-4ba5-a10d-c22dca5377e0/probe/0.log" Dec 17 10:07:45 crc kubenswrapper[4935]: I1217 10:07:45.940594 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-48fh2_a4733665-b253-4afc-b8a3-3028f3fb2892/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:46 crc kubenswrapper[4935]: I1217 10:07:46.473102 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kwrww_dd4964d0-85a5-474f-a8f5-084210467887/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:46 crc kubenswrapper[4935]: I1217 
10:07:46.538831 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-rp2ht_dd89ba3b-030b-43ae-9e1a-89d1401f81cd/init/0.log" Dec 17 10:07:46 crc kubenswrapper[4935]: I1217 10:07:46.676981 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-rp2ht_dd89ba3b-030b-43ae-9e1a-89d1401f81cd/init/0.log" Dec 17 10:07:46 crc kubenswrapper[4935]: I1217 10:07:46.761627 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-rp2ht_dd89ba3b-030b-43ae-9e1a-89d1401f81cd/dnsmasq-dns/0.log" Dec 17 10:07:46 crc kubenswrapper[4935]: I1217 10:07:46.768379 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8drs6_9a7d6590-bf03-479a-a094-259dd4efafef/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:46 crc kubenswrapper[4935]: I1217 10:07:46.960628 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ba46087-25fc-485e-b41e-7e55dbd860c6/glance-log/0.log" Dec 17 10:07:46 crc kubenswrapper[4935]: I1217 10:07:46.962896 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ba46087-25fc-485e-b41e-7e55dbd860c6/glance-httpd/0.log" Dec 17 10:07:47 crc kubenswrapper[4935]: I1217 10:07:47.130742 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a918434d-797e-4c05-b048-8a5c5cbc18c0/glance-log/0.log" Dec 17 10:07:47 crc kubenswrapper[4935]: I1217 10:07:47.151485 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a918434d-797e-4c05-b048-8a5c5cbc18c0/glance-httpd/0.log" Dec 17 10:07:47 crc kubenswrapper[4935]: I1217 10:07:47.392850 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bbfb6547d-64jt7_3658abd7-bc1e-4359-aa8b-011fe7189342/horizon/0.log" 
Dec 17 10:07:47 crc kubenswrapper[4935]: I1217 10:07:47.480074 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c_54cfb029-5b74-4da9-b0d3-0033fe2b3968/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:47 crc kubenswrapper[4935]: I1217 10:07:47.710323 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fk9h4_a3a33180-b0e2-45f0-bcda-e3c49acfac29/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:47 crc kubenswrapper[4935]: I1217 10:07:47.711423 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bbfb6547d-64jt7_3658abd7-bc1e-4359-aa8b-011fe7189342/horizon-log/0.log" Dec 17 10:07:47 crc kubenswrapper[4935]: I1217 10:07:47.882591 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29432761-2h8vv_6204d422-0f4a-40eb-a3ed-eb53e9220c9e/keystone-cron/0.log" Dec 17 10:07:48 crc kubenswrapper[4935]: I1217 10:07:48.017490 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_e33889bd-e62d-4b7c-83b1-a2ffc878b85a/kube-state-metrics/0.log" Dec 17 10:07:48 crc kubenswrapper[4935]: I1217 10:07:48.017883 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-548ff6dcf4-brlq9_b3785d8e-a1d0-41db-81df-41ba57d019e5/keystone-api/0.log" Dec 17 10:07:48 crc kubenswrapper[4935]: I1217 10:07:48.139049 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-znsdv_61772212-3ef5-4d2a-91be-96cd39dbb4e3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:48 crc kubenswrapper[4935]: I1217 10:07:48.493719 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8457c6df-5qkkl_67591c00-7d49-4db4-af34-8901c57dbb0b/neutron-httpd/0.log" Dec 17 10:07:48 crc kubenswrapper[4935]: I1217 
10:07:48.536296 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8457c6df-5qkkl_67591c00-7d49-4db4-af34-8901c57dbb0b/neutron-api/0.log" Dec 17 10:07:48 crc kubenswrapper[4935]: I1217 10:07:48.591083 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6_4454b07b-03d5-46e3-8277-232e491c91c1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:49 crc kubenswrapper[4935]: I1217 10:07:49.070064 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bbb5ad70-4355-4513-8420-a2e99ea5a3be/nova-api-log/0.log" Dec 17 10:07:49 crc kubenswrapper[4935]: I1217 10:07:49.156606 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2efc3ff0-93d5-4ec4-b843-496b06524eb0/nova-cell0-conductor-conductor/0.log" Dec 17 10:07:49 crc kubenswrapper[4935]: I1217 10:07:49.306168 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bbb5ad70-4355-4513-8420-a2e99ea5a3be/nova-api-api/0.log" Dec 17 10:07:49 crc kubenswrapper[4935]: I1217 10:07:49.453703 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_84e0cb55-6351-4230-bfd7-e89a1439df97/nova-cell1-conductor-conductor/0.log" Dec 17 10:07:49 crc kubenswrapper[4935]: I1217 10:07:49.539322 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b941dd61-25eb-4443-a4cf-356fbe73f67b/nova-cell1-novncproxy-novncproxy/0.log" Dec 17 10:07:49 crc kubenswrapper[4935]: I1217 10:07:49.653258 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-sp6wz_e1f63f1c-eca8-4e26-ab88-07f61efb54bb/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:49 crc kubenswrapper[4935]: I1217 10:07:49.876959 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_b8366a42-0c62-4527-b173-f7bfdbd2223a/nova-metadata-log/0.log" Dec 17 10:07:50 crc kubenswrapper[4935]: I1217 10:07:50.049774 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9a0706ae-c551-44cd-8fa7-ac4e2da28664/nova-scheduler-scheduler/0.log" Dec 17 10:07:50 crc kubenswrapper[4935]: I1217 10:07:50.168596 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_594360f1-b2e1-4e64-82d5-f6e471cd6850/mysql-bootstrap/0.log" Dec 17 10:07:50 crc kubenswrapper[4935]: I1217 10:07:50.590979 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_594360f1-b2e1-4e64-82d5-f6e471cd6850/galera/0.log" Dec 17 10:07:50 crc kubenswrapper[4935]: I1217 10:07:50.638875 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_594360f1-b2e1-4e64-82d5-f6e471cd6850/mysql-bootstrap/0.log" Dec 17 10:07:50 crc kubenswrapper[4935]: I1217 10:07:50.817848 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d3d1b63f-c619-4973-bad8-c90d12ccbbe1/mysql-bootstrap/0.log" Dec 17 10:07:50 crc kubenswrapper[4935]: I1217 10:07:50.996224 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b8366a42-0c62-4527-b173-f7bfdbd2223a/nova-metadata-metadata/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.028360 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d3d1b63f-c619-4973-bad8-c90d12ccbbe1/galera/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.055655 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d3d1b63f-c619-4973-bad8-c90d12ccbbe1/mysql-bootstrap/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.230562 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0/openstackclient/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.333717 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-l4zph_c1e01a1b-9baa-4738-8e44-b206863b4d3d/ovn-controller/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.454160 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-48lfw_7aab2b87-c484-40c4-a3c5-652f874476b2/openstack-network-exporter/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.580052 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovsdb-server-init/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.849967 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovs-vswitchd/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.874163 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovsdb-server-init/0.log" Dec 17 10:07:51 crc kubenswrapper[4935]: I1217 10:07:51.920557 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovsdb-server/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.115893 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-z6j8v_e59ff9f5-6277-4150-9d1b-91d323743ab8/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.124085 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:07:52 crc kubenswrapper[4935]: E1217 10:07:52.124813 4935 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.141664 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d36637f4-f52a-47af-8f2d-439f62b55b8d/openstack-network-exporter/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.142488 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d36637f4-f52a-47af-8f2d-439f62b55b8d/ovn-northd/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.328145 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a4f8af4b-aed3-46a8-84a0-aeae265a1309/openstack-network-exporter/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.357920 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a4f8af4b-aed3-46a8-84a0-aeae265a1309/ovsdbserver-nb/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.557045 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dca41fc9-68e2-4f42-88fe-942695deca13/ovsdbserver-sb/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.593548 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dca41fc9-68e2-4f42-88fe-942695deca13/openstack-network-exporter/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.695410 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5fc9d9db96-gh2fm_f4511340-5f20-486c-b7c3-2e4b04f60a14/placement-api/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.848289 4935 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2fa8407-e7ea-46c7-8144-87caba8e2f45/setup-container/0.log" Dec 17 10:07:52 crc kubenswrapper[4935]: I1217 10:07:52.940101 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5fc9d9db96-gh2fm_f4511340-5f20-486c-b7c3-2e4b04f60a14/placement-log/0.log" Dec 17 10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.097729 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2fa8407-e7ea-46c7-8144-87caba8e2f45/setup-container/0.log" Dec 17 10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.452982 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25b888a-54e1-47f0-8932-fa07c02f30a1/setup-container/0.log" Dec 17 10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.466232 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2fa8407-e7ea-46c7-8144-87caba8e2f45/rabbitmq/0.log" Dec 17 10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.526148 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25b888a-54e1-47f0-8932-fa07c02f30a1/setup-container/0.log" Dec 17 10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.649318 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25b888a-54e1-47f0-8932-fa07c02f30a1/rabbitmq/0.log" Dec 17 10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.748729 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86_63b4de7b-7933-4eba-8248-5ef0db9caa3e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.887915 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-m9znx_d34c1a27-0426-4b46-bf51-77110a3929cd/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 
10:07:53 crc kubenswrapper[4935]: I1217 10:07:53.989798 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vs284_5ce01088-f49b-44be-b4a9-08cd183488de/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.171507 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8cgmz_c2da33ef-f139-4ce2-9ef3-2a15cefcf653/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.249525 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d6l7r_aa726742-b847-49f9-8c0b-5814e42e1c66/ssh-known-hosts-edpm-deployment/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.538893 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-767f646f97-whvbb_562d41ed-7767-48a4-9cf4-84405e6deb48/proxy-httpd/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.553988 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-767f646f97-whvbb_562d41ed-7767-48a4-9cf4-84405e6deb48/proxy-server/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.626589 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jdzl5_499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7/swift-ring-rebalance/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.800974 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-reaper/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.849132 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-auditor/0.log" Dec 17 10:07:54 crc kubenswrapper[4935]: I1217 10:07:54.872954 4935 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-replicator/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.013828 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-server/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.082112 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-auditor/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.094712 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-replicator/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.099543 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-server/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.255562 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-updater/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.328170 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-expirer/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.346647 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-auditor/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.414847 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-replicator/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.480652 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-server/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.555307 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-updater/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.556842 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/rsync/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.636490 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/swift-recon-cron/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.855862 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_36307e10-5953-420e-9627-2812d493abea/tempest-tests-tempest-tests-runner/0.log" Dec 17 10:07:55 crc kubenswrapper[4935]: I1217 10:07:55.880694 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n_ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:56 crc kubenswrapper[4935]: I1217 10:07:56.053722 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2575fc30-0353-4349-9f66-5915130b3e06/test-operator-logs-container/0.log" Dec 17 10:07:56 crc kubenswrapper[4935]: I1217 10:07:56.194370 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt_2fabe8c0-2434-481a-9609-03c9ba3c30d5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:07:57 crc kubenswrapper[4935]: I1217 10:07:57.227675 4935 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" 
cgroupName=["kubepods","besteffort","pod751e0917-d3a0-41a2-abad-1b8420fe1d67"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod751e0917-d3a0-41a2-abad-1b8420fe1d67] : Timed out while waiting for systemd to remove kubepods-besteffort-pod751e0917_d3a0_41a2_abad_1b8420fe1d67.slice" Dec 17 10:07:57 crc kubenswrapper[4935]: E1217 10:07:57.227764 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod751e0917-d3a0-41a2-abad-1b8420fe1d67] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod751e0917-d3a0-41a2-abad-1b8420fe1d67] : Timed out while waiting for systemd to remove kubepods-besteffort-pod751e0917_d3a0_41a2_abad_1b8420fe1d67.slice" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" podUID="751e0917-d3a0-41a2-abad-1b8420fe1d67" Dec 17 10:07:57 crc kubenswrapper[4935]: I1217 10:07:57.494611 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4vtt6/crc-debug-jf8jn" Dec 17 10:08:04 crc kubenswrapper[4935]: I1217 10:08:04.124034 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:08:04 crc kubenswrapper[4935]: E1217 10:08:04.125081 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:08:05 crc kubenswrapper[4935]: I1217 10:08:05.237201 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f4019f7f-3fa3-4d4d-976b-b81f43530f0e/memcached/0.log" Dec 17 10:08:16 crc kubenswrapper[4935]: I1217 10:08:16.125708 4935 scope.go:117] 
"RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:08:16 crc kubenswrapper[4935]: E1217 10:08:16.127090 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:08:21 crc kubenswrapper[4935]: I1217 10:08:21.658213 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/util/0.log" Dec 17 10:08:21 crc kubenswrapper[4935]: I1217 10:08:21.899714 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/util/0.log" Dec 17 10:08:21 crc kubenswrapper[4935]: I1217 10:08:21.913992 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/pull/0.log" Dec 17 10:08:21 crc kubenswrapper[4935]: I1217 10:08:21.957517 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/pull/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.098649 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/util/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.147995 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/pull/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.182584 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/extract/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.350723 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-vzv2z_ff77991e-cda3-4547-878f-9a2785b3a9ab/manager/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.437903 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-9dv4l_83f649ce-a0cd-4405-9d8f-dee381d6f85a/manager/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.549141 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-2t5dm_b7549908-f751-4d15-bac6-e8ebcb550a55/manager/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.731569 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-94b2l_97f9414d-21a1-41dd-a4b0-cccffa57d46a/manager/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.790155 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-vwxjb_48d7ed73-e01d-48c1-98a4-22c4b3af76e3/manager/0.log" Dec 17 10:08:22 crc kubenswrapper[4935]: I1217 10:08:22.921163 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-6jdgb_dec986d2-14bc-4419-8e9c-9a6f4b1959d2/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.145979 4935 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-tq9rr_8e14339f-174c-4065-8021-a3a8e56b7e16/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.195915 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-4g8bl_58b1e21c-930d-4c0c-9469-3e37fd64b23d/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.416473 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-f9sfx_9c887d00-51f3-4980-9b38-45f0d53780b8/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.447015 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-dvgdt_9c2a1fd2-b473-40ad-873d-d4d9b79d5808/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.610378 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-9nnjm_d8de4c04-1b17-45ca-9084-d69cd737bba2/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.677669 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-zpm62_d7864302-210a-49dd-99ec-f33155990249/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.917660 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-hfklk_adcbaa5e-9235-4fbd-9641-929c51d02d00/manager/0.log" Dec 17 10:08:23 crc kubenswrapper[4935]: I1217 10:08:23.974577 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-j59sf_b44d3640-477b-4ab4-b514-9e8aa8f03fa4/manager/0.log" Dec 17 10:08:24 crc kubenswrapper[4935]: I1217 
10:08:24.108454 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9_361cd3f0-4302-4641-8b23-bfdb3904015f/manager/0.log" Dec 17 10:08:24 crc kubenswrapper[4935]: I1217 10:08:24.816432 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7ff595d8cc-s79xm_28ce268a-b7ac-4692-8870-063d1a26b9dc/operator/0.log" Dec 17 10:08:24 crc kubenswrapper[4935]: I1217 10:08:24.983673 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fb28k_9a5f90ee-af6e-42ff-94b9-87b969461bee/registry-server/0.log" Dec 17 10:08:25 crc kubenswrapper[4935]: I1217 10:08:25.096847 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-5zhxw_de693205-2ea2-4b43-aefc-5d4dbc8650d9/manager/0.log" Dec 17 10:08:25 crc kubenswrapper[4935]: I1217 10:08:25.196619 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-ndj2g_08ebe1c7-8852-4b51-8042-cd2b26a5cf50/manager/0.log" Dec 17 10:08:25 crc kubenswrapper[4935]: I1217 10:08:25.436366 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c897cfd74-2kxtc_cca50749-e7c9-4310-aaec-873208df4579/manager/0.log" Dec 17 10:08:25 crc kubenswrapper[4935]: I1217 10:08:25.463002 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xlq6d_80038f1d-56db-4e70-91cf-3cec348298cc/operator/0.log" Dec 17 10:08:25 crc kubenswrapper[4935]: I1217 10:08:25.592772 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-rk9ml_f39def3f-b302-4d51-a636-752b4d23ded0/manager/0.log" Dec 17 10:08:25 crc 
kubenswrapper[4935]: I1217 10:08:25.706202 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-2kr5x_bfc47d4c-a4db-4e06-ae0b-40afebb7c42e/manager/0.log" Dec 17 10:08:25 crc kubenswrapper[4935]: I1217 10:08:25.817839 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-mnxj7_e782899f-29fc-40c3-b7ef-41bfc66a221f/manager/0.log" Dec 17 10:08:25 crc kubenswrapper[4935]: I1217 10:08:25.984329 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-6cvpq_3f54488f-6b5c-458d-be0c-19b9248cc7b1/manager/0.log" Dec 17 10:08:27 crc kubenswrapper[4935]: I1217 10:08:27.124626 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:08:27 crc kubenswrapper[4935]: E1217 10:08:27.125309 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:08:41 crc kubenswrapper[4935]: I1217 10:08:41.132162 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:08:41 crc kubenswrapper[4935]: E1217 10:08:41.132973 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:08:45 crc kubenswrapper[4935]: I1217 10:08:45.191918 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-d7j8g_84312d40-3400-410d-9ba1-952f8ffbd442/control-plane-machine-set-operator/0.log" Dec 17 10:08:45 crc kubenswrapper[4935]: I1217 10:08:45.413903 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b2nl7_26bdd534-df87-4879-b036-377d8c606d5c/kube-rbac-proxy/0.log" Dec 17 10:08:45 crc kubenswrapper[4935]: I1217 10:08:45.418793 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b2nl7_26bdd534-df87-4879-b036-377d8c606d5c/machine-api-operator/0.log" Dec 17 10:08:54 crc kubenswrapper[4935]: I1217 10:08:54.124537 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:08:54 crc kubenswrapper[4935]: E1217 10:08:54.125293 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:08:57 crc kubenswrapper[4935]: I1217 10:08:57.753679 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-r966z_ece85099-fa51-490d-a498-cf35ec83a8ad/cert-manager-controller/0.log" Dec 17 10:08:57 crc kubenswrapper[4935]: I1217 10:08:57.930186 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wbl2l_4d701ce2-c118-46f4-904b-5294c782ce68/cert-manager-cainjector/0.log" Dec 17 10:08:58 crc kubenswrapper[4935]: I1217 10:08:58.022443 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5hx6v_be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1/cert-manager-webhook/0.log" Dec 17 10:09:05 crc kubenswrapper[4935]: I1217 10:09:05.124520 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:09:05 crc kubenswrapper[4935]: E1217 10:09:05.126670 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:09:10 crc kubenswrapper[4935]: I1217 10:09:10.677257 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-wd26t_c10c7c6c-f801-40c4-bff9-0f7b740e662b/nmstate-console-plugin/0.log" Dec 17 10:09:10 crc kubenswrapper[4935]: I1217 10:09:10.891974 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-n6kbf_c0fd208a-4408-45b2-88f7-979bf751ada6/kube-rbac-proxy/0.log" Dec 17 10:09:10 crc kubenswrapper[4935]: I1217 10:09:10.898797 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-29rb4_ffdacd08-7751-465d-a6f2-9037a7307280/nmstate-handler/0.log" Dec 17 10:09:10 crc kubenswrapper[4935]: I1217 10:09:10.933919 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-n6kbf_c0fd208a-4408-45b2-88f7-979bf751ada6/nmstate-metrics/0.log" Dec 17 10:09:11 crc kubenswrapper[4935]: I1217 10:09:11.073203 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-7z6d4_4fe03c14-1b10-4f5d-8107-3037bf3fd42e/nmstate-operator/0.log" Dec 17 10:09:11 crc kubenswrapper[4935]: I1217 10:09:11.155388 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-whnkc_e4582acb-2858-4fab-8bc9-e8e6ee6589dd/nmstate-webhook/0.log" Dec 17 10:09:20 crc kubenswrapper[4935]: I1217 10:09:20.124787 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:09:20 crc kubenswrapper[4935]: E1217 10:09:20.125792 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.034021 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-sg456_30c162bc-0446-4ce3-a601-3fb687465161/kube-rbac-proxy/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.185978 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-sg456_30c162bc-0446-4ce3-a601-3fb687465161/controller/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.307104 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 
10:09:26.451563 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.467715 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.479866 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.515906 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.709562 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.710209 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.717707 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.748858 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.933798 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.948007 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.978045 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/controller/0.log" Dec 17 10:09:26 crc kubenswrapper[4935]: I1217 10:09:26.986768 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.135108 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/frr-metrics/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.177592 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/kube-rbac-proxy/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.213750 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/kube-rbac-proxy-frr/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.417020 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/reloader/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.464572 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-7pk92_4331fce4-cd29-4cfc-90c0-45a97c6596a4/frr-k8s-webhook-server/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.731396 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-64d8b49b46-7fst7_d1068b00-4182-43a7-aa77-e2521de014b7/manager/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.920613 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_speaker-z8lgk_137edfb5-6e98-4aef-8a75-bf14297a7d3d/kube-rbac-proxy/0.log" Dec 17 10:09:27 crc kubenswrapper[4935]: I1217 10:09:27.942742 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c67684f-qpd2c_a7aeed44-1b4a-4d58-ac7f-077576b37887/webhook-server/0.log" Dec 17 10:09:28 crc kubenswrapper[4935]: I1217 10:09:28.581123 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/frr/0.log" Dec 17 10:09:28 crc kubenswrapper[4935]: I1217 10:09:28.585006 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8lgk_137edfb5-6e98-4aef-8a75-bf14297a7d3d/speaker/0.log" Dec 17 10:09:32 crc kubenswrapper[4935]: I1217 10:09:32.124554 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:09:32 crc kubenswrapper[4935]: E1217 10:09:32.125426 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:09:41 crc kubenswrapper[4935]: I1217 10:09:41.209194 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/util/0.log" Dec 17 10:09:41 crc kubenswrapper[4935]: I1217 10:09:41.329242 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/util/0.log" Dec 17 10:09:41 crc 
kubenswrapper[4935]: I1217 10:09:41.332014 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/pull/0.log" Dec 17 10:09:41 crc kubenswrapper[4935]: I1217 10:09:41.375645 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/pull/0.log" Dec 17 10:09:41 crc kubenswrapper[4935]: I1217 10:09:41.847326 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/extract/0.log" Dec 17 10:09:41 crc kubenswrapper[4935]: I1217 10:09:41.857265 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/pull/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.000750 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/util/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.111186 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/util/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.279704 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/pull/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.284030 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/util/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.299778 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/pull/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.508762 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/extract/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.509154 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/pull/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.514221 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/util/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.690129 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-utilities/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.897786 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-content/0.log" Dec 17 10:09:42 crc kubenswrapper[4935]: I1217 10:09:42.948331 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-content/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.022178 4935 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-utilities/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.124342 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:09:43 crc kubenswrapper[4935]: E1217 10:09:43.124634 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.184352 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-utilities/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.207804 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-content/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.429104 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-utilities/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.686026 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-content/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.686582 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/registry-server/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.701519 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-utilities/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.774907 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-content/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.909478 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-content/0.log" Dec 17 10:09:43 crc kubenswrapper[4935]: I1217 10:09:43.939562 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-utilities/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.188264 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zkk6r_f1baaa40-be04-428b-aaca-a5235d3f167e/marketplace-operator/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.348005 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-utilities/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.620597 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-utilities/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.636122 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-content/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.654763 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-content/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.658499 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/registry-server/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.897655 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-content/0.log" Dec 17 10:09:44 crc kubenswrapper[4935]: I1217 10:09:44.921145 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-utilities/0.log" Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.001135 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/registry-server/0.log" Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.113721 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-utilities/0.log" Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.240782 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-utilities/0.log" Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.259534 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-content/0.log" 
Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.259723 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-content/0.log" Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.452812 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-utilities/0.log" Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.475064 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-content/0.log" Dec 17 10:09:45 crc kubenswrapper[4935]: I1217 10:09:45.969200 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/registry-server/0.log" Dec 17 10:09:55 crc kubenswrapper[4935]: I1217 10:09:55.124986 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:09:55 crc kubenswrapper[4935]: E1217 10:09:55.126104 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:10:07 crc kubenswrapper[4935]: I1217 10:10:07.124683 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:10:07 crc kubenswrapper[4935]: E1217 10:10:07.125537 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 
5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:10:18 crc kubenswrapper[4935]: I1217 10:10:18.125728 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:10:18 crc kubenswrapper[4935]: E1217 10:10:18.126555 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:10:30 crc kubenswrapper[4935]: I1217 10:10:30.124086 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:10:30 crc kubenswrapper[4935]: E1217 10:10:30.124977 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:10:43 crc kubenswrapper[4935]: I1217 10:10:43.128782 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:10:43 crc kubenswrapper[4935]: E1217 10:10:43.129579 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:10:54 crc kubenswrapper[4935]: I1217 10:10:54.125406 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:10:54 crc kubenswrapper[4935]: E1217 10:10:54.128637 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:11:06 crc kubenswrapper[4935]: I1217 10:11:06.125016 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:11:07 crc kubenswrapper[4935]: I1217 10:11:07.152287 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"4bb02c25d5b1e7d9da62787747756e583e4f190d1c4af3c4ac15329c4859cbdc"} Dec 17 10:11:29 crc kubenswrapper[4935]: I1217 10:11:29.356107 4935 generic.go:334] "Generic (PLEG): container finished" podID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerID="bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd" exitCode=0 Dec 17 10:11:29 crc kubenswrapper[4935]: I1217 10:11:29.356198 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" 
event={"ID":"3b1c768d-12e1-431d-90c5-fad6b0d5be56","Type":"ContainerDied","Data":"bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd"} Dec 17 10:11:29 crc kubenswrapper[4935]: I1217 10:11:29.357452 4935 scope.go:117] "RemoveContainer" containerID="bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd" Dec 17 10:11:30 crc kubenswrapper[4935]: I1217 10:11:30.179848 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4vtt6_must-gather-tcjhj_3b1c768d-12e1-431d-90c5-fad6b0d5be56/gather/0.log" Dec 17 10:11:32 crc kubenswrapper[4935]: E1217 10:11:32.148387 4935 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:55632->38.102.83.75:36933: read tcp 38.102.83.75:55632->38.102.83.75:36933: read: connection reset by peer Dec 17 10:11:37 crc kubenswrapper[4935]: I1217 10:11:37.699066 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4vtt6/must-gather-tcjhj"] Dec 17 10:11:37 crc kubenswrapper[4935]: I1217 10:11:37.700071 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerName="copy" containerID="cri-o://0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec" gracePeriod=2 Dec 17 10:11:37 crc kubenswrapper[4935]: I1217 10:11:37.717207 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4vtt6/must-gather-tcjhj"] Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.141010 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4vtt6_must-gather-tcjhj_3b1c768d-12e1-431d-90c5-fad6b0d5be56/copy/0.log" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.141750 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.203833 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b1c768d-12e1-431d-90c5-fad6b0d5be56-must-gather-output\") pod \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.204093 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c52c\" (UniqueName: \"kubernetes.io/projected/3b1c768d-12e1-431d-90c5-fad6b0d5be56-kube-api-access-8c52c\") pod \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\" (UID: \"3b1c768d-12e1-431d-90c5-fad6b0d5be56\") " Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.210508 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1c768d-12e1-431d-90c5-fad6b0d5be56-kube-api-access-8c52c" (OuterVolumeSpecName: "kube-api-access-8c52c") pod "3b1c768d-12e1-431d-90c5-fad6b0d5be56" (UID: "3b1c768d-12e1-431d-90c5-fad6b0d5be56"). InnerVolumeSpecName "kube-api-access-8c52c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.305899 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c52c\" (UniqueName: \"kubernetes.io/projected/3b1c768d-12e1-431d-90c5-fad6b0d5be56-kube-api-access-8c52c\") on node \"crc\" DevicePath \"\"" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.367786 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b1c768d-12e1-431d-90c5-fad6b0d5be56-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3b1c768d-12e1-431d-90c5-fad6b0d5be56" (UID: "3b1c768d-12e1-431d-90c5-fad6b0d5be56"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.407616 4935 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3b1c768d-12e1-431d-90c5-fad6b0d5be56-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.449678 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4vtt6_must-gather-tcjhj_3b1c768d-12e1-431d-90c5-fad6b0d5be56/copy/0.log" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.451452 4935 generic.go:334] "Generic (PLEG): container finished" podID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerID="0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec" exitCode=143 Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.451602 4935 scope.go:117] "RemoveContainer" containerID="0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.451609 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4vtt6/must-gather-tcjhj" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.475237 4935 scope.go:117] "RemoveContainer" containerID="bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.533981 4935 scope.go:117] "RemoveContainer" containerID="0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec" Dec 17 10:11:38 crc kubenswrapper[4935]: E1217 10:11:38.534437 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec\": container with ID starting with 0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec not found: ID does not exist" containerID="0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.534479 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec"} err="failed to get container status \"0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec\": rpc error: code = NotFound desc = could not find container \"0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec\": container with ID starting with 0759b228dc1e0f2443792cec9148132835a9b3223c5a296dd07f8798ec35f6ec not found: ID does not exist" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.534503 4935 scope.go:117] "RemoveContainer" containerID="bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd" Dec 17 10:11:38 crc kubenswrapper[4935]: E1217 10:11:38.534754 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd\": container with ID starting with 
bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd not found: ID does not exist" containerID="bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd" Dec 17 10:11:38 crc kubenswrapper[4935]: I1217 10:11:38.534786 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd"} err="failed to get container status \"bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd\": rpc error: code = NotFound desc = could not find container \"bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd\": container with ID starting with bc71f3f6964720fad57bef47041dc9aeeeb1692bd22d97778ac0cfffb1da5cfd not found: ID does not exist" Dec 17 10:11:39 crc kubenswrapper[4935]: I1217 10:11:39.135523 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" path="/var/lib/kubelet/pods/3b1c768d-12e1-431d-90c5-fad6b0d5be56/volumes" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.240805 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7qtwh"] Dec 17 10:12:57 crc kubenswrapper[4935]: E1217 10:12:57.241989 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerName="gather" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.242005 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerName="gather" Dec 17 10:12:57 crc kubenswrapper[4935]: E1217 10:12:57.242032 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec60619-4bc2-4429-8ab6-d651ba11c0e0" containerName="container-00" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.242040 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec60619-4bc2-4429-8ab6-d651ba11c0e0" containerName="container-00" Dec 17 10:12:57 crc kubenswrapper[4935]: 
E1217 10:12:57.242053 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerName="copy" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.242063 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerName="copy" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.242349 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerName="gather" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.242382 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec60619-4bc2-4429-8ab6-d651ba11c0e0" containerName="container-00" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.242398 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1c768d-12e1-431d-90c5-fad6b0d5be56" containerName="copy" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.244010 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.254148 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qtwh"] Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.324842 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-catalog-content\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.324910 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpll5\" (UniqueName: \"kubernetes.io/projected/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-kube-api-access-kpll5\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.325067 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-utilities\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.427171 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpll5\" (UniqueName: \"kubernetes.io/projected/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-kube-api-access-kpll5\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.427636 4935 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-utilities\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.427674 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-catalog-content\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.428217 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-catalog-content\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.428304 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-utilities\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.455124 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpll5\" (UniqueName: \"kubernetes.io/projected/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-kube-api-access-kpll5\") pod \"community-operators-7qtwh\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:57 crc kubenswrapper[4935]: I1217 10:12:57.565652 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:12:58 crc kubenswrapper[4935]: I1217 10:12:58.042579 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qtwh"] Dec 17 10:12:58 crc kubenswrapper[4935]: I1217 10:12:58.164399 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qtwh" event={"ID":"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb","Type":"ContainerStarted","Data":"2447247643dc3510a2d84b62f237cb75fbcce4e7b6c4a6bbe9fab513ebc7bb79"} Dec 17 10:12:59 crc kubenswrapper[4935]: I1217 10:12:59.175598 4935 generic.go:334] "Generic (PLEG): container finished" podID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerID="88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba" exitCode=0 Dec 17 10:12:59 crc kubenswrapper[4935]: I1217 10:12:59.175700 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qtwh" event={"ID":"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb","Type":"ContainerDied","Data":"88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba"} Dec 17 10:12:59 crc kubenswrapper[4935]: I1217 10:12:59.177830 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 10:13:00 crc kubenswrapper[4935]: I1217 10:13:00.185383 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qtwh" event={"ID":"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb","Type":"ContainerStarted","Data":"35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1"} Dec 17 10:13:00 crc kubenswrapper[4935]: E1217 10:13:00.344506 4935 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2390f217_8e7c_4bbc_bb39_91fe4ce5a2bb.slice/crio-35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1.scope\": RecentStats: unable to find data in memory cache]" Dec 17 10:13:01 crc kubenswrapper[4935]: I1217 10:13:01.196155 4935 generic.go:334] "Generic (PLEG): container finished" podID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerID="35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1" exitCode=0 Dec 17 10:13:01 crc kubenswrapper[4935]: I1217 10:13:01.196197 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qtwh" event={"ID":"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb","Type":"ContainerDied","Data":"35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1"} Dec 17 10:13:02 crc kubenswrapper[4935]: I1217 10:13:02.207369 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qtwh" event={"ID":"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb","Type":"ContainerStarted","Data":"d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374"} Dec 17 10:13:02 crc kubenswrapper[4935]: I1217 10:13:02.226453 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7qtwh" podStartSLOduration=2.65398155 podStartE2EDuration="5.226432858s" podCreationTimestamp="2025-12-17 10:12:57 +0000 UTC" firstStartedPulling="2025-12-17 10:12:59.177532296 +0000 UTC m=+4098.837373059" lastFinishedPulling="2025-12-17 10:13:01.749983574 +0000 UTC m=+4101.409824367" observedRunningTime="2025-12-17 10:13:02.222210105 +0000 UTC m=+4101.882050868" watchObservedRunningTime="2025-12-17 10:13:02.226432858 +0000 UTC m=+4101.886273621" Dec 17 10:13:07 crc kubenswrapper[4935]: I1217 10:13:07.565939 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:13:07 crc 
kubenswrapper[4935]: I1217 10:13:07.568680 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:13:07 crc kubenswrapper[4935]: I1217 10:13:07.650045 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:13:08 crc kubenswrapper[4935]: I1217 10:13:08.331829 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:13:08 crc kubenswrapper[4935]: I1217 10:13:08.388169 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qtwh"] Dec 17 10:13:10 crc kubenswrapper[4935]: I1217 10:13:10.318493 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7qtwh" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="registry-server" containerID="cri-o://d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374" gracePeriod=2 Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.289132 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.307040 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-catalog-content\") pod \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.307104 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpll5\" (UniqueName: \"kubernetes.io/projected/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-kube-api-access-kpll5\") pod \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.307430 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-utilities\") pod \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\" (UID: \"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb\") " Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.308734 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-utilities" (OuterVolumeSpecName: "utilities") pod "2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" (UID: "2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.315546 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-kube-api-access-kpll5" (OuterVolumeSpecName: "kube-api-access-kpll5") pod "2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" (UID: "2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb"). InnerVolumeSpecName "kube-api-access-kpll5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.331639 4935 generic.go:334] "Generic (PLEG): container finished" podID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerID="d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374" exitCode=0 Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.331692 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qtwh" event={"ID":"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb","Type":"ContainerDied","Data":"d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374"} Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.331728 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qtwh" event={"ID":"2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb","Type":"ContainerDied","Data":"2447247643dc3510a2d84b62f237cb75fbcce4e7b6c4a6bbe9fab513ebc7bb79"} Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.331749 4935 scope.go:117] "RemoveContainer" containerID="d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.331745 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qtwh" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.368403 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" (UID: "2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.372491 4935 scope.go:117] "RemoveContainer" containerID="35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.391884 4935 scope.go:117] "RemoveContainer" containerID="88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.409982 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.410035 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.410046 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpll5\" (UniqueName: \"kubernetes.io/projected/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb-kube-api-access-kpll5\") on node \"crc\" DevicePath \"\"" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.439020 4935 scope.go:117] "RemoveContainer" containerID="d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374" Dec 17 10:13:11 crc kubenswrapper[4935]: E1217 10:13:11.439762 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374\": container with ID starting with d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374 not found: ID does not exist" containerID="d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.439897 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374"} err="failed to get container status \"d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374\": rpc error: code = NotFound desc = could not find container \"d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374\": container with ID starting with d40d4659b2a5a6a3e64bd2c20db94cee4b1434da32d6d2e6121059c8febb0374 not found: ID does not exist" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.440004 4935 scope.go:117] "RemoveContainer" containerID="35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1" Dec 17 10:13:11 crc kubenswrapper[4935]: E1217 10:13:11.440358 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1\": container with ID starting with 35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1 not found: ID does not exist" containerID="35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.440494 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1"} err="failed to get container status \"35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1\": rpc error: code = NotFound desc = could not find container \"35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1\": container with ID starting with 35a8436feb4baf5b5a907a032542a1cd85089b24ea87237ea7d69b0095d099e1 not found: ID does not exist" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.440582 4935 scope.go:117] "RemoveContainer" containerID="88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba" Dec 17 10:13:11 crc kubenswrapper[4935]: E1217 10:13:11.443492 4935 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba\": container with ID starting with 88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba not found: ID does not exist" containerID="88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.443594 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba"} err="failed to get container status \"88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba\": rpc error: code = NotFound desc = could not find container \"88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba\": container with ID starting with 88884b769975058c79fbb02cd2d2a04149da0f13c2582e00b5ff0b8eaf7caaba not found: ID does not exist" Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.662377 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qtwh"] Dec 17 10:13:11 crc kubenswrapper[4935]: I1217 10:13:11.670110 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7qtwh"] Dec 17 10:13:13 crc kubenswrapper[4935]: I1217 10:13:13.136802 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" path="/var/lib/kubelet/pods/2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb/volumes" Dec 17 10:13:13 crc kubenswrapper[4935]: I1217 10:13:13.921525 4935 scope.go:117] "RemoveContainer" containerID="1dc21379b7fc59826e5233126b237bda928766bfeeeabc920fea4a431002700d" Dec 17 10:13:30 crc kubenswrapper[4935]: I1217 10:13:30.138783 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:13:30 crc kubenswrapper[4935]: I1217 10:13:30.139686 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:14:00 crc kubenswrapper[4935]: I1217 10:14:00.131130 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:14:00 crc kubenswrapper[4935]: I1217 10:14:00.131797 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:14:30 crc kubenswrapper[4935]: I1217 10:14:30.130177 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:14:30 crc kubenswrapper[4935]: I1217 10:14:30.130999 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 
10:14:30 crc kubenswrapper[4935]: I1217 10:14:30.131047 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 10:14:30 crc kubenswrapper[4935]: I1217 10:14:30.131758 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4bb02c25d5b1e7d9da62787747756e583e4f190d1c4af3c4ac15329c4859cbdc"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 10:14:30 crc kubenswrapper[4935]: I1217 10:14:30.131826 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://4bb02c25d5b1e7d9da62787747756e583e4f190d1c4af3c4ac15329c4859cbdc" gracePeriod=600 Dec 17 10:14:31 crc kubenswrapper[4935]: I1217 10:14:31.019099 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="4bb02c25d5b1e7d9da62787747756e583e4f190d1c4af3c4ac15329c4859cbdc" exitCode=0 Dec 17 10:14:31 crc kubenswrapper[4935]: I1217 10:14:31.019323 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"4bb02c25d5b1e7d9da62787747756e583e4f190d1c4af3c4ac15329c4859cbdc"} Dec 17 10:14:31 crc kubenswrapper[4935]: I1217 10:14:31.019855 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916"} Dec 17 10:14:31 crc kubenswrapper[4935]: 
I1217 10:14:31.019879 4935 scope.go:117] "RemoveContainer" containerID="7d4764bdb67a1763239da0b5979852ca2db58a3007e739582b63ddf09dec6827" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.235081 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz64z/must-gather-tr6hb"] Dec 17 10:14:35 crc kubenswrapper[4935]: E1217 10:14:35.236163 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="extract-utilities" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.236181 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="extract-utilities" Dec 17 10:14:35 crc kubenswrapper[4935]: E1217 10:14:35.236204 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="registry-server" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.236211 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="registry-server" Dec 17 10:14:35 crc kubenswrapper[4935]: E1217 10:14:35.236235 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="extract-content" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.236243 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="extract-content" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.236923 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="2390f217-8e7c-4bbc-bb39-91fe4ce5a2bb" containerName="registry-server" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.238265 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.243806 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kz64z"/"kube-root-ca.crt" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.244075 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kz64z"/"openshift-service-ca.crt" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.260866 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz64z/must-gather-tr6hb"] Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.344818 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59bz8\" (UniqueName: \"kubernetes.io/projected/70ded1d2-623a-4398-9b8b-25f62f65eb28-kube-api-access-59bz8\") pod \"must-gather-tr6hb\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.345347 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70ded1d2-623a-4398-9b8b-25f62f65eb28-must-gather-output\") pod \"must-gather-tr6hb\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.451154 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59bz8\" (UniqueName: \"kubernetes.io/projected/70ded1d2-623a-4398-9b8b-25f62f65eb28-kube-api-access-59bz8\") pod \"must-gather-tr6hb\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.451217 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70ded1d2-623a-4398-9b8b-25f62f65eb28-must-gather-output\") pod \"must-gather-tr6hb\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.451971 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70ded1d2-623a-4398-9b8b-25f62f65eb28-must-gather-output\") pod \"must-gather-tr6hb\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.491697 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59bz8\" (UniqueName: \"kubernetes.io/projected/70ded1d2-623a-4398-9b8b-25f62f65eb28-kube-api-access-59bz8\") pod \"must-gather-tr6hb\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:35 crc kubenswrapper[4935]: I1217 10:14:35.578842 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:14:36 crc kubenswrapper[4935]: I1217 10:14:36.050408 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz64z/must-gather-tr6hb"] Dec 17 10:14:36 crc kubenswrapper[4935]: I1217 10:14:36.064823 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/must-gather-tr6hb" event={"ID":"70ded1d2-623a-4398-9b8b-25f62f65eb28","Type":"ContainerStarted","Data":"dbf746dd828365029ad218128ccfe10d05a7a753b2ef544847e23e9ed1bb8fa6"} Dec 17 10:14:37 crc kubenswrapper[4935]: I1217 10:14:37.074849 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/must-gather-tr6hb" event={"ID":"70ded1d2-623a-4398-9b8b-25f62f65eb28","Type":"ContainerStarted","Data":"760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee"} Dec 17 10:14:37 crc kubenswrapper[4935]: I1217 10:14:37.075566 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/must-gather-tr6hb" event={"ID":"70ded1d2-623a-4398-9b8b-25f62f65eb28","Type":"ContainerStarted","Data":"e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf"} Dec 17 10:14:37 crc kubenswrapper[4935]: I1217 10:14:37.091582 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kz64z/must-gather-tr6hb" podStartSLOduration=2.091565766 podStartE2EDuration="2.091565766s" podCreationTimestamp="2025-12-17 10:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 10:14:37.086629265 +0000 UTC m=+4196.746470028" watchObservedRunningTime="2025-12-17 10:14:37.091565766 +0000 UTC m=+4196.751406529" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.196676 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz64z/crc-debug-5kkgq"] Dec 17 10:14:40 crc kubenswrapper[4935]: 
I1217 10:14:40.199365 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.202777 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kz64z"/"default-dockercfg-mwswf" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.339418 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qmw5\" (UniqueName: \"kubernetes.io/projected/53adfde5-1b63-408e-98b3-706b249fb45e-kube-api-access-8qmw5\") pod \"crc-debug-5kkgq\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.339618 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53adfde5-1b63-408e-98b3-706b249fb45e-host\") pod \"crc-debug-5kkgq\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.441178 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53adfde5-1b63-408e-98b3-706b249fb45e-host\") pod \"crc-debug-5kkgq\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.441324 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qmw5\" (UniqueName: \"kubernetes.io/projected/53adfde5-1b63-408e-98b3-706b249fb45e-kube-api-access-8qmw5\") pod \"crc-debug-5kkgq\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.441403 4935 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53adfde5-1b63-408e-98b3-706b249fb45e-host\") pod \"crc-debug-5kkgq\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.462196 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qmw5\" (UniqueName: \"kubernetes.io/projected/53adfde5-1b63-408e-98b3-706b249fb45e-kube-api-access-8qmw5\") pod \"crc-debug-5kkgq\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:40 crc kubenswrapper[4935]: I1217 10:14:40.528036 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:14:41 crc kubenswrapper[4935]: I1217 10:14:41.119107 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" event={"ID":"53adfde5-1b63-408e-98b3-706b249fb45e","Type":"ContainerStarted","Data":"c38985c3373a7959dc0939f45323196b097ca8058d6516fad81d4489763e2829"} Dec 17 10:14:41 crc kubenswrapper[4935]: I1217 10:14:41.120484 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" event={"ID":"53adfde5-1b63-408e-98b3-706b249fb45e","Type":"ContainerStarted","Data":"4d1148236b6e525e561f7309284da6369e113f602d88f384560c90d811c33286"} Dec 17 10:14:41 crc kubenswrapper[4935]: I1217 10:14:41.146184 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" podStartSLOduration=1.146164296 podStartE2EDuration="1.146164296s" podCreationTimestamp="2025-12-17 10:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-17 10:14:41.141929363 +0000 UTC m=+4200.801770116" watchObservedRunningTime="2025-12-17 10:14:41.146164296 
+0000 UTC m=+4200.806005059" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.178164 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr"] Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.180224 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.182974 4935 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.183126 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.201979 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr"] Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.352567 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7f5a22-7df0-4477-8a1b-18a779933de7-secret-volume\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.352628 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7f5a22-7df0-4477-8a1b-18a779933de7-config-volume\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.352868 4935 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd6tp\" (UniqueName: \"kubernetes.io/projected/5c7f5a22-7df0-4477-8a1b-18a779933de7-kube-api-access-dd6tp\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.455151 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7f5a22-7df0-4477-8a1b-18a779933de7-secret-volume\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.455212 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7f5a22-7df0-4477-8a1b-18a779933de7-config-volume\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.455268 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd6tp\" (UniqueName: \"kubernetes.io/projected/5c7f5a22-7df0-4477-8a1b-18a779933de7-kube-api-access-dd6tp\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.456369 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7f5a22-7df0-4477-8a1b-18a779933de7-config-volume\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.468639 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7f5a22-7df0-4477-8a1b-18a779933de7-secret-volume\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.478446 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd6tp\" (UniqueName: \"kubernetes.io/projected/5c7f5a22-7df0-4477-8a1b-18a779933de7-kube-api-access-dd6tp\") pod \"collect-profiles-29432775-dbkdr\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.500659 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:00 crc kubenswrapper[4935]: I1217 10:15:00.972877 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr"] Dec 17 10:15:01 crc kubenswrapper[4935]: I1217 10:15:01.306191 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" event={"ID":"5c7f5a22-7df0-4477-8a1b-18a779933de7","Type":"ContainerStarted","Data":"035e8e849764047fe0bc9e9a72dcc5c3c2eb1696f326e1a9132503e8ea563685"} Dec 17 10:15:02 crc kubenswrapper[4935]: I1217 10:15:02.316754 4935 generic.go:334] "Generic (PLEG): container finished" podID="5c7f5a22-7df0-4477-8a1b-18a779933de7" containerID="c3284a20ea44303f2e109801d34a7c0e87c9565cda812e676cf21c87d67f7e68" exitCode=0 Dec 17 10:15:02 crc kubenswrapper[4935]: I1217 10:15:02.316798 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" event={"ID":"5c7f5a22-7df0-4477-8a1b-18a779933de7","Type":"ContainerDied","Data":"c3284a20ea44303f2e109801d34a7c0e87c9565cda812e676cf21c87d67f7e68"} Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.706571 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.820516 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd6tp\" (UniqueName: \"kubernetes.io/projected/5c7f5a22-7df0-4477-8a1b-18a779933de7-kube-api-access-dd6tp\") pod \"5c7f5a22-7df0-4477-8a1b-18a779933de7\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.820719 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7f5a22-7df0-4477-8a1b-18a779933de7-config-volume\") pod \"5c7f5a22-7df0-4477-8a1b-18a779933de7\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.820842 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7f5a22-7df0-4477-8a1b-18a779933de7-secret-volume\") pod \"5c7f5a22-7df0-4477-8a1b-18a779933de7\" (UID: \"5c7f5a22-7df0-4477-8a1b-18a779933de7\") " Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.822085 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c7f5a22-7df0-4477-8a1b-18a779933de7-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c7f5a22-7df0-4477-8a1b-18a779933de7" (UID: "5c7f5a22-7df0-4477-8a1b-18a779933de7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.829607 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c7f5a22-7df0-4477-8a1b-18a779933de7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c7f5a22-7df0-4477-8a1b-18a779933de7" (UID: "5c7f5a22-7df0-4477-8a1b-18a779933de7"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.844039 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c7f5a22-7df0-4477-8a1b-18a779933de7-kube-api-access-dd6tp" (OuterVolumeSpecName: "kube-api-access-dd6tp") pod "5c7f5a22-7df0-4477-8a1b-18a779933de7" (UID: "5c7f5a22-7df0-4477-8a1b-18a779933de7"). InnerVolumeSpecName "kube-api-access-dd6tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.922941 4935 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c7f5a22-7df0-4477-8a1b-18a779933de7-config-volume\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.923578 4935 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c7f5a22-7df0-4477-8a1b-18a779933de7-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:03 crc kubenswrapper[4935]: I1217 10:15:03.923995 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd6tp\" (UniqueName: \"kubernetes.io/projected/5c7f5a22-7df0-4477-8a1b-18a779933de7-kube-api-access-dd6tp\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:04 crc kubenswrapper[4935]: I1217 10:15:04.335654 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" event={"ID":"5c7f5a22-7df0-4477-8a1b-18a779933de7","Type":"ContainerDied","Data":"035e8e849764047fe0bc9e9a72dcc5c3c2eb1696f326e1a9132503e8ea563685"} Dec 17 10:15:04 crc kubenswrapper[4935]: I1217 10:15:04.335695 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29432775-dbkdr" Dec 17 10:15:04 crc kubenswrapper[4935]: I1217 10:15:04.335712 4935 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035e8e849764047fe0bc9e9a72dcc5c3c2eb1696f326e1a9132503e8ea563685" Dec 17 10:15:04 crc kubenswrapper[4935]: I1217 10:15:04.811948 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d"] Dec 17 10:15:04 crc kubenswrapper[4935]: I1217 10:15:04.821260 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29432730-9tg7d"] Dec 17 10:15:05 crc kubenswrapper[4935]: I1217 10:15:05.148633 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2bb91fe-0bf5-4422-a58c-0677652f6baa" path="/var/lib/kubelet/pods/b2bb91fe-0bf5-4422-a58c-0677652f6baa/volumes" Dec 17 10:15:14 crc kubenswrapper[4935]: I1217 10:15:14.003724 4935 scope.go:117] "RemoveContainer" containerID="40815833fa0990eb933830d7c29c34fdf89f68c098f6d02445d74910128e0333" Dec 17 10:15:15 crc kubenswrapper[4935]: I1217 10:15:15.431151 4935 generic.go:334] "Generic (PLEG): container finished" podID="53adfde5-1b63-408e-98b3-706b249fb45e" containerID="c38985c3373a7959dc0939f45323196b097ca8058d6516fad81d4489763e2829" exitCode=0 Dec 17 10:15:15 crc kubenswrapper[4935]: I1217 10:15:15.431214 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" event={"ID":"53adfde5-1b63-408e-98b3-706b249fb45e","Type":"ContainerDied","Data":"c38985c3373a7959dc0939f45323196b097ca8058d6516fad81d4489763e2829"} Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.708564 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.741591 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kz64z/crc-debug-5kkgq"] Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.750654 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kz64z/crc-debug-5kkgq"] Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.873452 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qmw5\" (UniqueName: \"kubernetes.io/projected/53adfde5-1b63-408e-98b3-706b249fb45e-kube-api-access-8qmw5\") pod \"53adfde5-1b63-408e-98b3-706b249fb45e\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.873972 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53adfde5-1b63-408e-98b3-706b249fb45e-host\") pod \"53adfde5-1b63-408e-98b3-706b249fb45e\" (UID: \"53adfde5-1b63-408e-98b3-706b249fb45e\") " Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.874089 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53adfde5-1b63-408e-98b3-706b249fb45e-host" (OuterVolumeSpecName: "host") pod "53adfde5-1b63-408e-98b3-706b249fb45e" (UID: "53adfde5-1b63-408e-98b3-706b249fb45e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.874461 4935 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53adfde5-1b63-408e-98b3-706b249fb45e-host\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.881526 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53adfde5-1b63-408e-98b3-706b249fb45e-kube-api-access-8qmw5" (OuterVolumeSpecName: "kube-api-access-8qmw5") pod "53adfde5-1b63-408e-98b3-706b249fb45e" (UID: "53adfde5-1b63-408e-98b3-706b249fb45e"). InnerVolumeSpecName "kube-api-access-8qmw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:15:16 crc kubenswrapper[4935]: I1217 10:15:16.976467 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qmw5\" (UniqueName: \"kubernetes.io/projected/53adfde5-1b63-408e-98b3-706b249fb45e-kube-api-access-8qmw5\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:17 crc kubenswrapper[4935]: I1217 10:15:17.135494 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53adfde5-1b63-408e-98b3-706b249fb45e" path="/var/lib/kubelet/pods/53adfde5-1b63-408e-98b3-706b249fb45e/volumes" Dec 17 10:15:17 crc kubenswrapper[4935]: I1217 10:15:17.448095 4935 scope.go:117] "RemoveContainer" containerID="c38985c3373a7959dc0939f45323196b097ca8058d6516fad81d4489763e2829" Dec 17 10:15:17 crc kubenswrapper[4935]: I1217 10:15:17.448143 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-5kkgq" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.004290 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz64z/crc-debug-pbkt5"] Dec 17 10:15:18 crc kubenswrapper[4935]: E1217 10:15:18.004770 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c7f5a22-7df0-4477-8a1b-18a779933de7" containerName="collect-profiles" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.004789 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c7f5a22-7df0-4477-8a1b-18a779933de7" containerName="collect-profiles" Dec 17 10:15:18 crc kubenswrapper[4935]: E1217 10:15:18.004812 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53adfde5-1b63-408e-98b3-706b249fb45e" containerName="container-00" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.004821 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="53adfde5-1b63-408e-98b3-706b249fb45e" containerName="container-00" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.005072 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c7f5a22-7df0-4477-8a1b-18a779933de7" containerName="collect-profiles" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.005096 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="53adfde5-1b63-408e-98b3-706b249fb45e" containerName="container-00" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.005932 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.008554 4935 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-kz64z"/"default-dockercfg-mwswf" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.118572 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-host\") pod \"crc-debug-pbkt5\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.118636 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdr8\" (UniqueName: \"kubernetes.io/projected/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-kube-api-access-thdr8\") pod \"crc-debug-pbkt5\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.220220 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-host\") pod \"crc-debug-pbkt5\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.220312 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdr8\" (UniqueName: \"kubernetes.io/projected/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-kube-api-access-thdr8\") pod \"crc-debug-pbkt5\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.220353 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-host\") pod \"crc-debug-pbkt5\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.238981 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdr8\" (UniqueName: \"kubernetes.io/projected/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-kube-api-access-thdr8\") pod \"crc-debug-pbkt5\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.322192 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:18 crc kubenswrapper[4935]: I1217 10:15:18.463621 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/crc-debug-pbkt5" event={"ID":"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f","Type":"ContainerStarted","Data":"8dc39c5d6c6a994620b775f79c404ced7f1282429a1dd510a9479b78493384fa"} Dec 17 10:15:19 crc kubenswrapper[4935]: I1217 10:15:19.474764 4935 generic.go:334] "Generic (PLEG): container finished" podID="cf24d0d4-91e2-42b2-bf28-d085d4c8b82f" containerID="1df9e514a6e83f8a1335d28536280795707884a3792bd76b14373ee7394fe178" exitCode=0 Dec 17 10:15:19 crc kubenswrapper[4935]: I1217 10:15:19.474827 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/crc-debug-pbkt5" event={"ID":"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f","Type":"ContainerDied","Data":"1df9e514a6e83f8a1335d28536280795707884a3792bd76b14373ee7394fe178"} Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.029907 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kz64z/crc-debug-pbkt5"] Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.039037 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-kz64z/crc-debug-pbkt5"] Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.578300 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.769294 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-host\") pod \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.769378 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thdr8\" (UniqueName: \"kubernetes.io/projected/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-kube-api-access-thdr8\") pod \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\" (UID: \"cf24d0d4-91e2-42b2-bf28-d085d4c8b82f\") " Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.770527 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-host" (OuterVolumeSpecName: "host") pod "cf24d0d4-91e2-42b2-bf28-d085d4c8b82f" (UID: "cf24d0d4-91e2-42b2-bf28-d085d4c8b82f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.776157 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-kube-api-access-thdr8" (OuterVolumeSpecName: "kube-api-access-thdr8") pod "cf24d0d4-91e2-42b2-bf28-d085d4c8b82f" (UID: "cf24d0d4-91e2-42b2-bf28-d085d4c8b82f"). InnerVolumeSpecName "kube-api-access-thdr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.876367 4935 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-host\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:20 crc kubenswrapper[4935]: I1217 10:15:20.876422 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thdr8\" (UniqueName: \"kubernetes.io/projected/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f-kube-api-access-thdr8\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.136557 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf24d0d4-91e2-42b2-bf28-d085d4c8b82f" path="/var/lib/kubelet/pods/cf24d0d4-91e2-42b2-bf28-d085d4c8b82f/volumes" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.184590 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz64z/crc-debug-r4hnk"] Dec 17 10:15:21 crc kubenswrapper[4935]: E1217 10:15:21.184995 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf24d0d4-91e2-42b2-bf28-d085d4c8b82f" containerName="container-00" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.185011 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf24d0d4-91e2-42b2-bf28-d085d4c8b82f" containerName="container-00" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.185172 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf24d0d4-91e2-42b2-bf28-d085d4c8b82f" containerName="container-00" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.186176 4935 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.387189 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdfad77-6e91-42ad-969f-6e7104640dfb-host\") pod \"crc-debug-r4hnk\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.387459 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t87lb\" (UniqueName: \"kubernetes.io/projected/6cdfad77-6e91-42ad-969f-6e7104640dfb-kube-api-access-t87lb\") pod \"crc-debug-r4hnk\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.489072 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87lb\" (UniqueName: \"kubernetes.io/projected/6cdfad77-6e91-42ad-969f-6e7104640dfb-kube-api-access-t87lb\") pod \"crc-debug-r4hnk\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.489217 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdfad77-6e91-42ad-969f-6e7104640dfb-host\") pod \"crc-debug-r4hnk\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.489547 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdfad77-6e91-42ad-969f-6e7104640dfb-host\") pod \"crc-debug-r4hnk\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc 
kubenswrapper[4935]: I1217 10:15:21.491682 4935 scope.go:117] "RemoveContainer" containerID="1df9e514a6e83f8a1335d28536280795707884a3792bd76b14373ee7394fe178" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.491859 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-pbkt5" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.508392 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87lb\" (UniqueName: \"kubernetes.io/projected/6cdfad77-6e91-42ad-969f-6e7104640dfb-kube-api-access-t87lb\") pod \"crc-debug-r4hnk\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc kubenswrapper[4935]: I1217 10:15:21.803036 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:21 crc kubenswrapper[4935]: W1217 10:15:21.842043 4935 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cdfad77_6e91_42ad_969f_6e7104640dfb.slice/crio-667a29966cfb3d0783d8d192d18cefb3249762f77d3ba167a77adf7c759a569f WatchSource:0}: Error finding container 667a29966cfb3d0783d8d192d18cefb3249762f77d3ba167a77adf7c759a569f: Status 404 returned error can't find the container with id 667a29966cfb3d0783d8d192d18cefb3249762f77d3ba167a77adf7c759a569f Dec 17 10:15:22 crc kubenswrapper[4935]: I1217 10:15:22.502403 4935 generic.go:334] "Generic (PLEG): container finished" podID="6cdfad77-6e91-42ad-969f-6e7104640dfb" containerID="7e5f97129a98d4b16cf600a4dc01f2ab2565a649e194f51d42ff0bf4519f98e5" exitCode=0 Dec 17 10:15:22 crc kubenswrapper[4935]: I1217 10:15:22.502836 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/crc-debug-r4hnk" 
event={"ID":"6cdfad77-6e91-42ad-969f-6e7104640dfb","Type":"ContainerDied","Data":"7e5f97129a98d4b16cf600a4dc01f2ab2565a649e194f51d42ff0bf4519f98e5"} Dec 17 10:15:22 crc kubenswrapper[4935]: I1217 10:15:22.502864 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/crc-debug-r4hnk" event={"ID":"6cdfad77-6e91-42ad-969f-6e7104640dfb","Type":"ContainerStarted","Data":"667a29966cfb3d0783d8d192d18cefb3249762f77d3ba167a77adf7c759a569f"} Dec 17 10:15:22 crc kubenswrapper[4935]: I1217 10:15:22.540597 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kz64z/crc-debug-r4hnk"] Dec 17 10:15:22 crc kubenswrapper[4935]: I1217 10:15:22.552230 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kz64z/crc-debug-r4hnk"] Dec 17 10:15:23 crc kubenswrapper[4935]: I1217 10:15:23.621779 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:23 crc kubenswrapper[4935]: I1217 10:15:23.631378 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t87lb\" (UniqueName: \"kubernetes.io/projected/6cdfad77-6e91-42ad-969f-6e7104640dfb-kube-api-access-t87lb\") pod \"6cdfad77-6e91-42ad-969f-6e7104640dfb\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " Dec 17 10:15:23 crc kubenswrapper[4935]: I1217 10:15:23.631467 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdfad77-6e91-42ad-969f-6e7104640dfb-host\") pod \"6cdfad77-6e91-42ad-969f-6e7104640dfb\" (UID: \"6cdfad77-6e91-42ad-969f-6e7104640dfb\") " Dec 17 10:15:23 crc kubenswrapper[4935]: I1217 10:15:23.631814 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6cdfad77-6e91-42ad-969f-6e7104640dfb-host" (OuterVolumeSpecName: "host") pod "6cdfad77-6e91-42ad-969f-6e7104640dfb" (UID: 
"6cdfad77-6e91-42ad-969f-6e7104640dfb"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 17 10:15:23 crc kubenswrapper[4935]: I1217 10:15:23.642033 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cdfad77-6e91-42ad-969f-6e7104640dfb-kube-api-access-t87lb" (OuterVolumeSpecName: "kube-api-access-t87lb") pod "6cdfad77-6e91-42ad-969f-6e7104640dfb" (UID: "6cdfad77-6e91-42ad-969f-6e7104640dfb"). InnerVolumeSpecName "kube-api-access-t87lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:15:23 crc kubenswrapper[4935]: I1217 10:15:23.733301 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t87lb\" (UniqueName: \"kubernetes.io/projected/6cdfad77-6e91-42ad-969f-6e7104640dfb-kube-api-access-t87lb\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:23 crc kubenswrapper[4935]: I1217 10:15:23.733711 4935 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6cdfad77-6e91-42ad-969f-6e7104640dfb-host\") on node \"crc\" DevicePath \"\"" Dec 17 10:15:24 crc kubenswrapper[4935]: I1217 10:15:24.520614 4935 scope.go:117] "RemoveContainer" containerID="7e5f97129a98d4b16cf600a4dc01f2ab2565a649e194f51d42ff0bf4519f98e5" Dec 17 10:15:24 crc kubenswrapper[4935]: I1217 10:15:24.520662 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz64z/crc-debug-r4hnk" Dec 17 10:15:25 crc kubenswrapper[4935]: I1217 10:15:25.136131 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cdfad77-6e91-42ad-969f-6e7104640dfb" path="/var/lib/kubelet/pods/6cdfad77-6e91-42ad-969f-6e7104640dfb/volumes" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.041662 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b5f4b44fb-ktsl4_e4ef2a77-ca25-495d-a00c-f15993955019/barbican-api/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.091195 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b5f4b44fb-ktsl4_e4ef2a77-ca25-495d-a00c-f15993955019/barbican-api-log/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.201563 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f4bb4f7d-ntfxs_4798a48d-9fdb-40d8-9890-595874a05215/barbican-keystone-listener/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.301509 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5f4bb4f7d-ntfxs_4798a48d-9fdb-40d8-9890-595874a05215/barbican-keystone-listener-log/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.326706 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69fff79bd9-55nbl_93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863/barbican-worker/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.370808 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69fff79bd9-55nbl_93f1b5f8-7ca2-4fd0-9887-fcf4eb22a863/barbican-worker-log/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.514532 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-d5mvz_1b3c1c73-3f87-4383-9d09-1931001f0629/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.640191 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/ceilometer-central-agent/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.722137 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/proxy-httpd/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.728747 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/ceilometer-notification-agent/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.813308 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7d6f03f3-bf3c-478b-83fa-e1d1d2f0f053/sg-core/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.938224 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f6a66562-ec0b-4302-8e76-a4567917d90a/cinder-api/0.log" Dec 17 10:15:45 crc kubenswrapper[4935]: I1217 10:15:45.961439 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_f6a66562-ec0b-4302-8e76-a4567917d90a/cinder-api-log/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.158257 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7fb81e79-940d-4ba5-a10d-c22dca5377e0/probe/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.194954 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7fb81e79-940d-4ba5-a10d-c22dca5377e0/cinder-scheduler/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.273202 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-48fh2_a4733665-b253-4afc-b8a3-3028f3fb2892/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.403310 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-kwrww_dd4964d0-85a5-474f-a8f5-084210467887/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.480748 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-rp2ht_dd89ba3b-030b-43ae-9e1a-89d1401f81cd/init/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.643359 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-rp2ht_dd89ba3b-030b-43ae-9e1a-89d1401f81cd/init/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.719182 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-rp2ht_dd89ba3b-030b-43ae-9e1a-89d1401f81cd/dnsmasq-dns/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.734174 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8drs6_9a7d6590-bf03-479a-a094-259dd4efafef/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.923963 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ba46087-25fc-485e-b41e-7e55dbd860c6/glance-log/0.log" Dec 17 10:15:46 crc kubenswrapper[4935]: I1217 10:15:46.926632 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_2ba46087-25fc-485e-b41e-7e55dbd860c6/glance-httpd/0.log" Dec 17 10:15:47 crc kubenswrapper[4935]: I1217 10:15:47.050072 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_a918434d-797e-4c05-b048-8a5c5cbc18c0/glance-httpd/0.log" Dec 17 10:15:47 crc kubenswrapper[4935]: I1217 10:15:47.123470 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a918434d-797e-4c05-b048-8a5c5cbc18c0/glance-log/0.log" Dec 17 10:15:47 crc kubenswrapper[4935]: I1217 10:15:47.273354 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bbfb6547d-64jt7_3658abd7-bc1e-4359-aa8b-011fe7189342/horizon/0.log" Dec 17 10:15:47 crc kubenswrapper[4935]: I1217 10:15:47.449363 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-j2m7c_54cfb029-5b74-4da9-b0d3-0033fe2b3968/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:47 crc kubenswrapper[4935]: I1217 10:15:47.705764 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-7bbfb6547d-64jt7_3658abd7-bc1e-4359-aa8b-011fe7189342/horizon-log/0.log" Dec 17 10:15:47 crc kubenswrapper[4935]: I1217 10:15:47.747089 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-fk9h4_a3a33180-b0e2-45f0-bcda-e3c49acfac29/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:47 crc kubenswrapper[4935]: I1217 10:15:47.969846 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29432761-2h8vv_6204d422-0f4a-40eb-a3ed-eb53e9220c9e/keystone-cron/0.log" Dec 17 10:15:48 crc kubenswrapper[4935]: I1217 10:15:48.008969 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-548ff6dcf4-brlq9_b3785d8e-a1d0-41db-81df-41ba57d019e5/keystone-api/0.log" Dec 17 10:15:48 crc kubenswrapper[4935]: I1217 10:15:48.140743 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_e33889bd-e62d-4b7c-83b1-a2ffc878b85a/kube-state-metrics/0.log" Dec 17 10:15:48 crc kubenswrapper[4935]: I1217 10:15:48.199768 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-znsdv_61772212-3ef5-4d2a-91be-96cd39dbb4e3/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:48 crc kubenswrapper[4935]: I1217 10:15:48.557780 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8457c6df-5qkkl_67591c00-7d49-4db4-af34-8901c57dbb0b/neutron-httpd/0.log" Dec 17 10:15:48 crc kubenswrapper[4935]: I1217 10:15:48.563654 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-7c8457c6df-5qkkl_67591c00-7d49-4db4-af34-8901c57dbb0b/neutron-api/0.log" Dec 17 10:15:48 crc kubenswrapper[4935]: I1217 10:15:48.774806 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-s68b6_4454b07b-03d5-46e3-8277-232e491c91c1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:49 crc kubenswrapper[4935]: I1217 10:15:49.273880 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bbb5ad70-4355-4513-8420-a2e99ea5a3be/nova-api-log/0.log" Dec 17 10:15:49 crc kubenswrapper[4935]: I1217 10:15:49.347385 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2efc3ff0-93d5-4ec4-b843-496b06524eb0/nova-cell0-conductor-conductor/0.log" Dec 17 10:15:49 crc kubenswrapper[4935]: I1217 10:15:49.598191 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_bbb5ad70-4355-4513-8420-a2e99ea5a3be/nova-api-api/0.log" Dec 17 10:15:49 crc kubenswrapper[4935]: I1217 10:15:49.615750 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_84e0cb55-6351-4230-bfd7-e89a1439df97/nova-cell1-conductor-conductor/0.log" 
Dec 17 10:15:49 crc kubenswrapper[4935]: I1217 10:15:49.701494 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b941dd61-25eb-4443-a4cf-356fbe73f67b/nova-cell1-novncproxy-novncproxy/0.log" Dec 17 10:15:49 crc kubenswrapper[4935]: I1217 10:15:49.939085 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-sp6wz_e1f63f1c-eca8-4e26-ab88-07f61efb54bb/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:49 crc kubenswrapper[4935]: I1217 10:15:49.955835 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b8366a42-0c62-4527-b173-f7bfdbd2223a/nova-metadata-log/0.log" Dec 17 10:15:50 crc kubenswrapper[4935]: I1217 10:15:50.342050 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_594360f1-b2e1-4e64-82d5-f6e471cd6850/mysql-bootstrap/0.log" Dec 17 10:15:50 crc kubenswrapper[4935]: I1217 10:15:50.403506 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_9a0706ae-c551-44cd-8fa7-ac4e2da28664/nova-scheduler-scheduler/0.log" Dec 17 10:15:50 crc kubenswrapper[4935]: I1217 10:15:50.516483 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_594360f1-b2e1-4e64-82d5-f6e471cd6850/mysql-bootstrap/0.log" Dec 17 10:15:50 crc kubenswrapper[4935]: I1217 10:15:50.533445 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_594360f1-b2e1-4e64-82d5-f6e471cd6850/galera/0.log" Dec 17 10:15:50 crc kubenswrapper[4935]: I1217 10:15:50.728799 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d3d1b63f-c619-4973-bad8-c90d12ccbbe1/mysql-bootstrap/0.log" Dec 17 10:15:50 crc kubenswrapper[4935]: I1217 10:15:50.911972 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_d3d1b63f-c619-4973-bad8-c90d12ccbbe1/mysql-bootstrap/0.log" Dec 17 10:15:50 crc kubenswrapper[4935]: I1217 10:15:50.980312 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_d3d1b63f-c619-4973-bad8-c90d12ccbbe1/galera/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.101863 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_31f0e1f2-aecd-4e5e-96a6-deeee6e7bdb0/openstackclient/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.249304 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-l4zph_c1e01a1b-9baa-4738-8e44-b206863b4d3d/ovn-controller/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.434354 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-48lfw_7aab2b87-c484-40c4-a3c5-652f874476b2/openstack-network-exporter/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.486021 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b8366a42-0c62-4527-b173-f7bfdbd2223a/nova-metadata-metadata/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.574176 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovsdb-server-init/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.706006 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovs-vswitchd/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.730511 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovsdb-server/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.771771 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-jhrvs_dcc296df-d8ab-4200-ae5f-3ecf58c614b1/ovsdb-server-init/0.log" Dec 17 10:15:51 crc kubenswrapper[4935]: I1217 10:15:51.981323 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-z6j8v_e59ff9f5-6277-4150-9d1b-91d323743ab8/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.022088 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d36637f4-f52a-47af-8f2d-439f62b55b8d/openstack-network-exporter/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.065691 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d36637f4-f52a-47af-8f2d-439f62b55b8d/ovn-northd/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.240039 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a4f8af4b-aed3-46a8-84a0-aeae265a1309/ovsdbserver-nb/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.256881 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a4f8af4b-aed3-46a8-84a0-aeae265a1309/openstack-network-exporter/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.448682 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dca41fc9-68e2-4f42-88fe-942695deca13/ovsdbserver-sb/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.458375 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_dca41fc9-68e2-4f42-88fe-942695deca13/openstack-network-exporter/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.683684 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5fc9d9db96-gh2fm_f4511340-5f20-486c-b7c3-2e4b04f60a14/placement-api/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.727729 4935 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2fa8407-e7ea-46c7-8144-87caba8e2f45/setup-container/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.757782 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5fc9d9db96-gh2fm_f4511340-5f20-486c-b7c3-2e4b04f60a14/placement-log/0.log" Dec 17 10:15:52 crc kubenswrapper[4935]: I1217 10:15:52.929945 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2fa8407-e7ea-46c7-8144-87caba8e2f45/setup-container/0.log" Dec 17 10:15:53 crc kubenswrapper[4935]: I1217 10:15:53.516659 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a2fa8407-e7ea-46c7-8144-87caba8e2f45/rabbitmq/0.log" Dec 17 10:15:53 crc kubenswrapper[4935]: I1217 10:15:53.632095 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25b888a-54e1-47f0-8932-fa07c02f30a1/setup-container/0.log" Dec 17 10:15:53 crc kubenswrapper[4935]: I1217 10:15:53.859721 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25b888a-54e1-47f0-8932-fa07c02f30a1/rabbitmq/0.log" Dec 17 10:15:53 crc kubenswrapper[4935]: I1217 10:15:53.909897 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c25b888a-54e1-47f0-8932-fa07c02f30a1/setup-container/0.log" Dec 17 10:15:53 crc kubenswrapper[4935]: I1217 10:15:53.974137 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-q5z86_63b4de7b-7933-4eba-8248-5ef0db9caa3e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:54 crc kubenswrapper[4935]: I1217 10:15:54.107658 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-m9znx_d34c1a27-0426-4b46-bf51-77110a3929cd/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:54 crc 
kubenswrapper[4935]: I1217 10:15:54.418693 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-vs284_5ce01088-f49b-44be-b4a9-08cd183488de/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:54 crc kubenswrapper[4935]: I1217 10:15:54.553541 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-8cgmz_c2da33ef-f139-4ce2-9ef3-2a15cefcf653/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:54 crc kubenswrapper[4935]: I1217 10:15:54.643683 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-d6l7r_aa726742-b847-49f9-8c0b-5814e42e1c66/ssh-known-hosts-edpm-deployment/0.log" Dec 17 10:15:54 crc kubenswrapper[4935]: I1217 10:15:54.995795 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-767f646f97-whvbb_562d41ed-7767-48a4-9cf4-84405e6deb48/proxy-httpd/0.log" Dec 17 10:15:55 crc kubenswrapper[4935]: I1217 10:15:55.483896 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-767f646f97-whvbb_562d41ed-7767-48a4-9cf4-84405e6deb48/proxy-server/0.log" Dec 17 10:15:55 crc kubenswrapper[4935]: I1217 10:15:55.547579 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-jdzl5_499a79fe-0a9a-4a2c-98b1-1e0c99c2bdc7/swift-ring-rebalance/0.log" Dec 17 10:15:55 crc kubenswrapper[4935]: I1217 10:15:55.815945 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-auditor/0.log" Dec 17 10:15:55 crc kubenswrapper[4935]: I1217 10:15:55.819317 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-reaper/0.log" Dec 17 10:15:55 crc kubenswrapper[4935]: I1217 10:15:55.846217 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-replicator/0.log" Dec 17 10:15:55 crc kubenswrapper[4935]: I1217 10:15:55.891526 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/account-server/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.078782 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-auditor/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.090015 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-server/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.122770 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-updater/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.133113 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/container-replicator/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.279390 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-expirer/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.334676 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-server/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.361693 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-auditor/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.407594 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-replicator/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.487210 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/object-updater/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.594535 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/swift-recon-cron/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.645432 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_a2c97a57-ac68-4fab-acbd-ecdec8db5fb5/rsync/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.843248 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-xlf6n_ebc35f42-c9e2-422d-80d5-7d3a7b4e6dcb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:15:56 crc kubenswrapper[4935]: I1217 10:15:56.915059 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_36307e10-5953-420e-9627-2812d493abea/tempest-tests-tempest-tests-runner/0.log" Dec 17 10:15:57 crc kubenswrapper[4935]: I1217 10:15:57.023013 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_2575fc30-0353-4349-9f66-5915130b3e06/test-operator-logs-container/0.log" Dec 17 10:15:57 crc kubenswrapper[4935]: I1217 10:15:57.196303 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-mbwzt_2fabe8c0-2434-481a-9609-03c9ba3c30d5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 17 10:16:08 crc kubenswrapper[4935]: I1217 10:16:08.936435 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_f4019f7f-3fa3-4d4d-976b-b81f43530f0e/memcached/0.log" Dec 17 10:16:23 crc kubenswrapper[4935]: I1217 10:16:23.403356 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/util/0.log" Dec 17 10:16:23 crc kubenswrapper[4935]: I1217 10:16:23.603798 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/pull/0.log" Dec 17 10:16:23 crc kubenswrapper[4935]: I1217 10:16:23.622312 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/util/0.log" Dec 17 10:16:23 crc kubenswrapper[4935]: I1217 10:16:23.622854 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/pull/0.log" Dec 17 10:16:23 crc kubenswrapper[4935]: I1217 10:16:23.821646 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/pull/0.log" Dec 17 10:16:23 crc kubenswrapper[4935]: I1217 10:16:23.822204 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/util/0.log" Dec 17 10:16:23 crc kubenswrapper[4935]: I1217 10:16:23.826587 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_66e648f2a060352786f57c72e95f29d5f4fff3a61780f7a5412ae0f0f2xjcr2_ee37f41b-412b-4273-959c-099aeb26681f/extract/0.log" Dec 17 10:16:24 crc kubenswrapper[4935]: I1217 10:16:24.082256 4935 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5f98b4754f-9dv4l_83f649ce-a0cd-4405-9d8f-dee381d6f85a/manager/0.log" Dec 17 10:16:24 crc kubenswrapper[4935]: I1217 10:16:24.103114 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-95949466-vzv2z_ff77991e-cda3-4547-878f-9a2785b3a9ab/manager/0.log" Dec 17 10:16:24 crc kubenswrapper[4935]: I1217 10:16:24.250887 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66f8b87655-2t5dm_b7549908-f751-4d15-bac6-e8ebcb550a55/manager/0.log" Dec 17 10:16:24 crc kubenswrapper[4935]: I1217 10:16:24.367694 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-767f9d7567-94b2l_97f9414d-21a1-41dd-a4b0-cccffa57d46a/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.093002 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-59b8dcb766-vwxjb_48d7ed73-e01d-48c1-98a4-22c4b3af76e3/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.244956 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6ccf486b9-6jdgb_dec986d2-14bc-4419-8e9c-9a6f4b1959d2/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.507832 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-f458558d7-tq9rr_8e14339f-174c-4065-8021-a3a8e56b7e16/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.522332 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-84b495f78-4g8bl_58b1e21c-930d-4c0c-9469-3e37fd64b23d/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.742749 4935 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5c7cbf548f-f9sfx_9c887d00-51f3-4980-9b38-45f0d53780b8/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.765650 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-5fdd9786f7-dvgdt_9c2a1fd2-b473-40ad-873d-d4d9b79d5808/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.929582 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-f76f4954c-9nnjm_d8de4c04-1b17-45ca-9084-d69cd737bba2/manager/0.log" Dec 17 10:16:25 crc kubenswrapper[4935]: I1217 10:16:25.996309 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-7cd87b778f-zpm62_d7864302-210a-49dd-99ec-f33155990249/manager/0.log" Dec 17 10:16:26 crc kubenswrapper[4935]: I1217 10:16:26.185945 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5fbbf8b6cc-j59sf_b44d3640-477b-4ab4-b514-9e8aa8f03fa4/manager/0.log" Dec 17 10:16:26 crc kubenswrapper[4935]: I1217 10:16:26.220230 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-68c649d9d-hfklk_adcbaa5e-9235-4fbd-9641-929c51d02d00/manager/0.log" Dec 17 10:16:26 crc kubenswrapper[4935]: I1217 10:16:26.396585 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5c9b6778c7fhdj9_361cd3f0-4302-4641-8b23-bfdb3904015f/manager/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.276195 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fb28k_9a5f90ee-af6e-42ff-94b9-87b969461bee/registry-server/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 
10:16:27.454092 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7ff595d8cc-s79xm_28ce268a-b7ac-4692-8870-063d1a26b9dc/operator/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.464317 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-5zhxw_de693205-2ea2-4b43-aefc-5d4dbc8650d9/manager/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.592565 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8665b56d78-ndj2g_08ebe1c7-8852-4b51-8042-cd2b26a5cf50/manager/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.627490 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5c897cfd74-2kxtc_cca50749-e7c9-4310-aaec-873208df4579/manager/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.682993 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xlq6d_80038f1d-56db-4e70-91cf-3cec348298cc/operator/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.790417 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5c6df8f9-rk9ml_f39def3f-b302-4d51-a636-752b4d23ded0/manager/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.929855 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-756ccf86c7-mnxj7_e782899f-29fc-40c3-b7ef-41bfc66a221f/manager/0.log" Dec 17 10:16:27 crc kubenswrapper[4935]: I1217 10:16:27.931888 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-97d456b9-2kr5x_bfc47d4c-a4db-4e06-ae0b-40afebb7c42e/manager/0.log" Dec 17 10:16:28 crc 
kubenswrapper[4935]: I1217 10:16:28.049307 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-55f78b7c4c-6cvpq_3f54488f-6b5c-458d-be0c-19b9248cc7b1/manager/0.log" Dec 17 10:16:30 crc kubenswrapper[4935]: I1217 10:16:30.130927 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:16:30 crc kubenswrapper[4935]: I1217 10:16:30.131405 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:16:46 crc kubenswrapper[4935]: I1217 10:16:46.271054 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-d7j8g_84312d40-3400-410d-9ba1-952f8ffbd442/control-plane-machine-set-operator/0.log" Dec 17 10:16:46 crc kubenswrapper[4935]: I1217 10:16:46.305903 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b2nl7_26bdd534-df87-4879-b036-377d8c606d5c/kube-rbac-proxy/0.log" Dec 17 10:16:46 crc kubenswrapper[4935]: I1217 10:16:46.464647 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-b2nl7_26bdd534-df87-4879-b036-377d8c606d5c/machine-api-operator/0.log" Dec 17 10:16:58 crc kubenswrapper[4935]: I1217 10:16:58.758378 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-r966z_ece85099-fa51-490d-a498-cf35ec83a8ad/cert-manager-controller/0.log" Dec 
17 10:16:58 crc kubenswrapper[4935]: I1217 10:16:58.949068 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wbl2l_4d701ce2-c118-46f4-904b-5294c782ce68/cert-manager-cainjector/0.log" Dec 17 10:16:58 crc kubenswrapper[4935]: I1217 10:16:58.963704 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5hx6v_be7a439b-efa4-4fdd-b56d-5e8d53f2ceb1/cert-manager-webhook/0.log" Dec 17 10:17:00 crc kubenswrapper[4935]: I1217 10:17:00.130582 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:17:00 crc kubenswrapper[4935]: I1217 10:17:00.131219 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:17:11 crc kubenswrapper[4935]: I1217 10:17:11.257048 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6ff7998486-wd26t_c10c7c6c-f801-40c4-bff9-0f7b740e662b/nmstate-console-plugin/0.log" Dec 17 10:17:11 crc kubenswrapper[4935]: I1217 10:17:11.396827 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-29rb4_ffdacd08-7751-465d-a6f2-9037a7307280/nmstate-handler/0.log" Dec 17 10:17:11 crc kubenswrapper[4935]: I1217 10:17:11.438398 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-n6kbf_c0fd208a-4408-45b2-88f7-979bf751ada6/kube-rbac-proxy/0.log" Dec 17 10:17:11 crc kubenswrapper[4935]: I1217 
10:17:11.497236 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f7f7578db-n6kbf_c0fd208a-4408-45b2-88f7-979bf751ada6/nmstate-metrics/0.log" Dec 17 10:17:11 crc kubenswrapper[4935]: I1217 10:17:11.590750 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-6769fb99d-7z6d4_4fe03c14-1b10-4f5d-8107-3037bf3fd42e/nmstate-operator/0.log" Dec 17 10:17:11 crc kubenswrapper[4935]: I1217 10:17:11.696534 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-f8fb84555-whnkc_e4582acb-2858-4fab-8bc9-e8e6ee6589dd/nmstate-webhook/0.log" Dec 17 10:17:25 crc kubenswrapper[4935]: I1217 10:17:25.654116 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-sg456_30c162bc-0446-4ce3-a601-3fb687465161/kube-rbac-proxy/0.log" Dec 17 10:17:25 crc kubenswrapper[4935]: I1217 10:17:25.777147 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5bddd4b946-sg456_30c162bc-0446-4ce3-a601-3fb687465161/controller/0.log" Dec 17 10:17:25 crc kubenswrapper[4935]: I1217 10:17:25.900730 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.052396 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.069640 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.072868 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:17:26 crc 
kubenswrapper[4935]: I1217 10:17:26.105598 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.280555 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.311619 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.323371 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.338186 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.512654 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-reloader/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.517503 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-metrics/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.522254 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/controller/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.529512 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/cp-frr-files/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.719506 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/kube-rbac-proxy-frr/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.721487 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/frr-metrics/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.722321 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/kube-rbac-proxy/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.917211 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/reloader/0.log" Dec 17 10:17:26 crc kubenswrapper[4935]: I1217 10:17:26.928793 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7784b6fcf-7pk92_4331fce4-cd29-4cfc-90c0-45a97c6596a4/frr-k8s-webhook-server/0.log" Dec 17 10:17:27 crc kubenswrapper[4935]: I1217 10:17:27.166256 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-64d8b49b46-7fst7_d1068b00-4182-43a7-aa77-e2521de014b7/manager/0.log" Dec 17 10:17:27 crc kubenswrapper[4935]: I1217 10:17:27.316094 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6c67684f-qpd2c_a7aeed44-1b4a-4d58-ac7f-077576b37887/webhook-server/0.log" Dec 17 10:17:27 crc kubenswrapper[4935]: I1217 10:17:27.352096 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8lgk_137edfb5-6e98-4aef-8a75-bf14297a7d3d/kube-rbac-proxy/0.log" Dec 17 10:17:28 crc kubenswrapper[4935]: I1217 10:17:28.002716 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-z8lgk_137edfb5-6e98-4aef-8a75-bf14297a7d3d/speaker/0.log" Dec 17 10:17:28 crc kubenswrapper[4935]: I1217 10:17:28.396499 4935 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-k268q_88717473-e7d0-4a23-b03b-5cada6284ad1/frr/0.log" Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.130322 4935 patch_prober.go:28] interesting pod/machine-config-daemon-k7lhw container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.130739 4935 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.130778 4935 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.131308 4935 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916"} pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.131362 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerName="machine-config-daemon" containerID="cri-o://6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" gracePeriod=600 Dec 17 10:17:30 crc kubenswrapper[4935]: E1217 10:17:30.289824 4935 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.580541 4935 generic.go:334] "Generic (PLEG): container finished" podID="6d8b2226-e518-487d-967a-78cbfd4da1dc" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" exitCode=0 Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.580584 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerDied","Data":"6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916"} Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.580617 4935 scope.go:117] "RemoveContainer" containerID="4bb02c25d5b1e7d9da62787747756e583e4f190d1c4af3c4ac15329c4859cbdc" Dec 17 10:17:30 crc kubenswrapper[4935]: I1217 10:17:30.581224 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:17:30 crc kubenswrapper[4935]: E1217 10:17:30.581550 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:17:41 crc kubenswrapper[4935]: I1217 10:17:41.448129 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/util/0.log" Dec 17 10:17:41 crc kubenswrapper[4935]: I1217 10:17:41.604986 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/pull/0.log" Dec 17 10:17:41 crc kubenswrapper[4935]: I1217 10:17:41.606244 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/util/0.log" Dec 17 10:17:41 crc kubenswrapper[4935]: I1217 10:17:41.660583 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/pull/0.log" Dec 17 10:17:41 crc kubenswrapper[4935]: I1217 10:17:41.831684 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/util/0.log" Dec 17 10:17:41 crc kubenswrapper[4935]: I1217 10:17:41.841455 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/pull/0.log" Dec 17 10:17:41 crc kubenswrapper[4935]: I1217 10:17:41.864506 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5b7fccbebf0e22d2dd769066fa7aaa90fd620c5db34f2af6c91e4319d4pwjb6_17e4c03a-b327-4bf9-8e1e-55b4fe3d7f8d/extract/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.038192 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/util/0.log" Dec 17 
10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.196568 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/pull/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.197163 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/pull/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.252449 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/util/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.416546 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/util/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.422250 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/pull/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.478122 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98085b0df3808ebec39f9f9529f737144fe2dbcdaa4f334014817c0fa8r65df_d50ec877-316c-4993-9906-7830748759d7/extract/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.611158 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-utilities/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.789434 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-utilities/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.835614 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-content/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.870833 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-content/0.log" Dec 17 10:17:42 crc kubenswrapper[4935]: I1217 10:17:42.983439 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-content/0.log" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.004912 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/extract-utilities/0.log" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.124334 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:17:43 crc kubenswrapper[4935]: E1217 10:17:43.124844 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.258242 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-utilities/0.log" Dec 17 10:17:43 
crc kubenswrapper[4935]: I1217 10:17:43.466629 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-utilities/0.log" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.546517 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-content/0.log" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.547761 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-content/0.log" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.573020 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zpgjx_6e4d8134-256e-4ca3-adb5-7dc5beaf48ac/registry-server/0.log" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.719734 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-utilities/0.log" Dec 17 10:17:43 crc kubenswrapper[4935]: I1217 10:17:43.728825 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/extract-content/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.105298 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-utilities/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.110604 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-zkk6r_f1baaa40-be04-428b-aaca-a5235d3f167e/marketplace-operator/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.358541 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-content/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.427042 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-s9kxf_13e7e638-f648-4b5f-9589-9258a45be193/registry-server/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.431549 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-content/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.474445 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-utilities/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.621171 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-content/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.632933 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/extract-utilities/0.log" Dec 17 10:17:44 crc kubenswrapper[4935]: I1217 10:17:44.767112 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wvpv9_618e1775-1752-4cf2-b98a-562046bdc3f6/registry-server/0.log" Dec 17 10:17:45 crc kubenswrapper[4935]: I1217 10:17:45.015878 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-utilities/0.log" Dec 17 10:17:45 crc kubenswrapper[4935]: I1217 10:17:45.194808 4935 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-content/0.log" Dec 17 10:17:45 crc kubenswrapper[4935]: I1217 10:17:45.220435 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-utilities/0.log" Dec 17 10:17:45 crc kubenswrapper[4935]: I1217 10:17:45.241808 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-content/0.log" Dec 17 10:17:45 crc kubenswrapper[4935]: I1217 10:17:45.444754 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-content/0.log" Dec 17 10:17:45 crc kubenswrapper[4935]: I1217 10:17:45.457724 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/extract-utilities/0.log" Dec 17 10:17:45 crc kubenswrapper[4935]: I1217 10:17:45.918404 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-55vfx_2fe737c1-a407-4da3-a492-a49c892b1db9/registry-server/0.log" Dec 17 10:17:56 crc kubenswrapper[4935]: I1217 10:17:56.124685 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:17:56 crc kubenswrapper[4935]: E1217 10:17:56.125671 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:18:11 crc 
kubenswrapper[4935]: I1217 10:18:11.131412 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:18:11 crc kubenswrapper[4935]: E1217 10:18:11.132680 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:18:24 crc kubenswrapper[4935]: I1217 10:18:24.125252 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:18:24 crc kubenswrapper[4935]: E1217 10:18:24.126432 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:18:35 crc kubenswrapper[4935]: I1217 10:18:35.124173 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:18:35 crc kubenswrapper[4935]: E1217 10:18:35.125014 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 
17 10:18:50 crc kubenswrapper[4935]: I1217 10:18:50.124892 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:18:50 crc kubenswrapper[4935]: E1217 10:18:50.125801 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:19:04 crc kubenswrapper[4935]: I1217 10:19:04.124687 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:19:04 crc kubenswrapper[4935]: E1217 10:19:04.125545 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:19:15 crc kubenswrapper[4935]: I1217 10:19:15.125952 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:19:15 crc kubenswrapper[4935]: E1217 10:19:15.126767 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" 
podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:19:30 crc kubenswrapper[4935]: I1217 10:19:30.124122 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:19:30 crc kubenswrapper[4935]: E1217 10:19:30.125157 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:19:30 crc kubenswrapper[4935]: I1217 10:19:30.629099 4935 generic.go:334] "Generic (PLEG): container finished" podID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerID="e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf" exitCode=0 Dec 17 10:19:30 crc kubenswrapper[4935]: I1217 10:19:30.629155 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz64z/must-gather-tr6hb" event={"ID":"70ded1d2-623a-4398-9b8b-25f62f65eb28","Type":"ContainerDied","Data":"e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf"} Dec 17 10:19:30 crc kubenswrapper[4935]: I1217 10:19:30.629926 4935 scope.go:117] "RemoveContainer" containerID="e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf" Dec 17 10:19:31 crc kubenswrapper[4935]: I1217 10:19:31.537725 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kz64z_must-gather-tr6hb_70ded1d2-623a-4398-9b8b-25f62f65eb28/gather/0.log" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.154083 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kz64z/must-gather-tr6hb"] Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.155092 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-kz64z/must-gather-tr6hb"] Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.155323 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-kz64z/must-gather-tr6hb" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerName="copy" containerID="cri-o://760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee" gracePeriod=2 Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.580543 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kz64z_must-gather-tr6hb_70ded1d2-623a-4398-9b8b-25f62f65eb28/copy/0.log" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.581497 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.624654 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59bz8\" (UniqueName: \"kubernetes.io/projected/70ded1d2-623a-4398-9b8b-25f62f65eb28-kube-api-access-59bz8\") pod \"70ded1d2-623a-4398-9b8b-25f62f65eb28\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.624915 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70ded1d2-623a-4398-9b8b-25f62f65eb28-must-gather-output\") pod \"70ded1d2-623a-4398-9b8b-25f62f65eb28\" (UID: \"70ded1d2-623a-4398-9b8b-25f62f65eb28\") " Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.630946 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70ded1d2-623a-4398-9b8b-25f62f65eb28-kube-api-access-59bz8" (OuterVolumeSpecName: "kube-api-access-59bz8") pod "70ded1d2-623a-4398-9b8b-25f62f65eb28" (UID: "70ded1d2-623a-4398-9b8b-25f62f65eb28"). InnerVolumeSpecName "kube-api-access-59bz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.724144 4935 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kz64z_must-gather-tr6hb_70ded1d2-623a-4398-9b8b-25f62f65eb28/copy/0.log" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.724836 4935 generic.go:334] "Generic (PLEG): container finished" podID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerID="760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee" exitCode=143 Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.724976 4935 scope.go:117] "RemoveContainer" containerID="760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.725100 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz64z/must-gather-tr6hb" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.727940 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59bz8\" (UniqueName: \"kubernetes.io/projected/70ded1d2-623a-4398-9b8b-25f62f65eb28-kube-api-access-59bz8\") on node \"crc\" DevicePath \"\"" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.762551 4935 scope.go:117] "RemoveContainer" containerID="e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.786905 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70ded1d2-623a-4398-9b8b-25f62f65eb28-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "70ded1d2-623a-4398-9b8b-25f62f65eb28" (UID: "70ded1d2-623a-4398-9b8b-25f62f65eb28"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.829902 4935 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/70ded1d2-623a-4398-9b8b-25f62f65eb28-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.841962 4935 scope.go:117] "RemoveContainer" containerID="760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee" Dec 17 10:19:41 crc kubenswrapper[4935]: E1217 10:19:41.843029 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee\": container with ID starting with 760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee not found: ID does not exist" containerID="760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.843073 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee"} err="failed to get container status \"760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee\": rpc error: code = NotFound desc = could not find container \"760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee\": container with ID starting with 760b545c8d76e674f62785a6b869e875be191644bb9f0d0b524d179acf589fee not found: ID does not exist" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.843099 4935 scope.go:117] "RemoveContainer" containerID="e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf" Dec 17 10:19:41 crc kubenswrapper[4935]: E1217 10:19:41.847382 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf\": container with ID starting with e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf not found: ID does not exist" containerID="e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf" Dec 17 10:19:41 crc kubenswrapper[4935]: I1217 10:19:41.847437 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf"} err="failed to get container status \"e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf\": rpc error: code = NotFound desc = could not find container \"e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf\": container with ID starting with e7c0840787b1cf5bd30ddff3bc34c5979626337cb196793945733cb9b3c743cf not found: ID does not exist" Dec 17 10:19:42 crc kubenswrapper[4935]: I1217 10:19:42.123741 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:19:42 crc kubenswrapper[4935]: E1217 10:19:42.124057 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:19:43 crc kubenswrapper[4935]: I1217 10:19:43.137252 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" path="/var/lib/kubelet/pods/70ded1d2-623a-4398-9b8b-25f62f65eb28/volumes" Dec 17 10:19:54 crc kubenswrapper[4935]: I1217 10:19:54.124022 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:19:54 crc 
kubenswrapper[4935]: E1217 10:19:54.125028 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:20:07 crc kubenswrapper[4935]: I1217 10:20:07.124499 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:20:07 crc kubenswrapper[4935]: E1217 10:20:07.125470 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:20:19 crc kubenswrapper[4935]: I1217 10:20:19.124395 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:20:19 crc kubenswrapper[4935]: E1217 10:20:19.125240 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:20:30 crc kubenswrapper[4935]: I1217 10:20:30.124710 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 
17 10:20:30 crc kubenswrapper[4935]: E1217 10:20:30.125590 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.267070 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lhghs"] Dec 17 10:20:42 crc kubenswrapper[4935]: E1217 10:20:42.268050 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cdfad77-6e91-42ad-969f-6e7104640dfb" containerName="container-00" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.268063 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cdfad77-6e91-42ad-969f-6e7104640dfb" containerName="container-00" Dec 17 10:20:42 crc kubenswrapper[4935]: E1217 10:20:42.268092 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerName="copy" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.268098 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerName="copy" Dec 17 10:20:42 crc kubenswrapper[4935]: E1217 10:20:42.268111 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerName="gather" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.268118 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerName="gather" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.268327 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerName="gather" Dec 
17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.268357 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cdfad77-6e91-42ad-969f-6e7104640dfb" containerName="container-00" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.268365 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="70ded1d2-623a-4398-9b8b-25f62f65eb28" containerName="copy" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.269658 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.281419 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhghs"] Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.438421 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlsnd\" (UniqueName: \"kubernetes.io/projected/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-kube-api-access-jlsnd\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.438682 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-utilities\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.438892 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-catalog-content\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc 
kubenswrapper[4935]: I1217 10:20:42.540958 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-utilities\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.541033 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-catalog-content\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.541113 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlsnd\" (UniqueName: \"kubernetes.io/projected/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-kube-api-access-jlsnd\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.541700 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-catalog-content\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.541700 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-utilities\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.562422 4935 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlsnd\" (UniqueName: \"kubernetes.io/projected/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-kube-api-access-jlsnd\") pod \"redhat-operators-lhghs\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:42 crc kubenswrapper[4935]: I1217 10:20:42.600022 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:43 crc kubenswrapper[4935]: I1217 10:20:43.056966 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lhghs"] Dec 17 10:20:43 crc kubenswrapper[4935]: I1217 10:20:43.125781 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:20:43 crc kubenswrapper[4935]: E1217 10:20:43.126611 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:20:43 crc kubenswrapper[4935]: I1217 10:20:43.249483 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhghs" event={"ID":"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93","Type":"ContainerStarted","Data":"1bb06d5bd83cdca9216f497a27e4090609281433e82954b8f9968d8ddc1b5aac"} Dec 17 10:20:44 crc kubenswrapper[4935]: I1217 10:20:44.261159 4935 generic.go:334] "Generic (PLEG): container finished" podID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerID="596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d" exitCode=0 Dec 17 10:20:44 crc kubenswrapper[4935]: I1217 10:20:44.261463 
4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhghs" event={"ID":"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93","Type":"ContainerDied","Data":"596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d"} Dec 17 10:20:44 crc kubenswrapper[4935]: I1217 10:20:44.263490 4935 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 17 10:20:45 crc kubenswrapper[4935]: I1217 10:20:45.272858 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhghs" event={"ID":"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93","Type":"ContainerStarted","Data":"701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467"} Dec 17 10:20:46 crc kubenswrapper[4935]: I1217 10:20:46.283742 4935 generic.go:334] "Generic (PLEG): container finished" podID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerID="701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467" exitCode=0 Dec 17 10:20:46 crc kubenswrapper[4935]: I1217 10:20:46.283845 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhghs" event={"ID":"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93","Type":"ContainerDied","Data":"701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467"} Dec 17 10:20:47 crc kubenswrapper[4935]: I1217 10:20:47.293924 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhghs" event={"ID":"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93","Type":"ContainerStarted","Data":"33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820"} Dec 17 10:20:47 crc kubenswrapper[4935]: I1217 10:20:47.320043 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lhghs" podStartSLOduration=2.805561517 podStartE2EDuration="5.320021589s" podCreationTimestamp="2025-12-17 10:20:42 +0000 UTC" firstStartedPulling="2025-12-17 10:20:44.26322731 +0000 
UTC m=+4563.923068073" lastFinishedPulling="2025-12-17 10:20:46.777687382 +0000 UTC m=+4566.437528145" observedRunningTime="2025-12-17 10:20:47.309865651 +0000 UTC m=+4566.969706414" watchObservedRunningTime="2025-12-17 10:20:47.320021589 +0000 UTC m=+4566.979862362" Dec 17 10:20:52 crc kubenswrapper[4935]: I1217 10:20:52.600811 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:52 crc kubenswrapper[4935]: I1217 10:20:52.602498 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:52 crc kubenswrapper[4935]: I1217 10:20:52.641787 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:53 crc kubenswrapper[4935]: I1217 10:20:53.388102 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:53 crc kubenswrapper[4935]: I1217 10:20:53.436145 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhghs"] Dec 17 10:20:54 crc kubenswrapper[4935]: I1217 10:20:54.124665 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:20:54 crc kubenswrapper[4935]: E1217 10:20:54.125044 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:20:55 crc kubenswrapper[4935]: I1217 10:20:55.358648 4935 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-lhghs" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="registry-server" containerID="cri-o://33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820" gracePeriod=2 Dec 17 10:20:55 crc kubenswrapper[4935]: I1217 10:20:55.800748 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:55 crc kubenswrapper[4935]: I1217 10:20:55.958560 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-utilities\") pod \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " Dec 17 10:20:55 crc kubenswrapper[4935]: I1217 10:20:55.958695 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-catalog-content\") pod \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " Dec 17 10:20:55 crc kubenswrapper[4935]: I1217 10:20:55.958746 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlsnd\" (UniqueName: \"kubernetes.io/projected/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-kube-api-access-jlsnd\") pod \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\" (UID: \"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93\") " Dec 17 10:20:55 crc kubenswrapper[4935]: I1217 10:20:55.959791 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-utilities" (OuterVolumeSpecName: "utilities") pod "75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" (UID: "75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:20:55 crc kubenswrapper[4935]: I1217 10:20:55.964895 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-kube-api-access-jlsnd" (OuterVolumeSpecName: "kube-api-access-jlsnd") pod "75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" (UID: "75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93"). InnerVolumeSpecName "kube-api-access-jlsnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.061681 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlsnd\" (UniqueName: \"kubernetes.io/projected/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-kube-api-access-jlsnd\") on node \"crc\" DevicePath \"\"" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.062348 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.375143 4935 generic.go:334] "Generic (PLEG): container finished" podID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerID="33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820" exitCode=0 Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.375196 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhghs" event={"ID":"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93","Type":"ContainerDied","Data":"33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820"} Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.375214 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lhghs" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.375235 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lhghs" event={"ID":"75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93","Type":"ContainerDied","Data":"1bb06d5bd83cdca9216f497a27e4090609281433e82954b8f9968d8ddc1b5aac"} Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.375257 4935 scope.go:117] "RemoveContainer" containerID="33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.397956 4935 scope.go:117] "RemoveContainer" containerID="701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.417527 4935 scope.go:117] "RemoveContainer" containerID="596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.461335 4935 scope.go:117] "RemoveContainer" containerID="33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820" Dec 17 10:20:56 crc kubenswrapper[4935]: E1217 10:20:56.461780 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820\": container with ID starting with 33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820 not found: ID does not exist" containerID="33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.461828 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820"} err="failed to get container status \"33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820\": rpc error: code = NotFound desc = could not find container 
\"33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820\": container with ID starting with 33badc018f57c41575c3c2bd413ebe91cf49510214633fb10951f20c5f54e820 not found: ID does not exist" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.461856 4935 scope.go:117] "RemoveContainer" containerID="701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467" Dec 17 10:20:56 crc kubenswrapper[4935]: E1217 10:20:56.462345 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467\": container with ID starting with 701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467 not found: ID does not exist" containerID="701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.462401 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467"} err="failed to get container status \"701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467\": rpc error: code = NotFound desc = could not find container \"701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467\": container with ID starting with 701a3ae969de41c502d783fcb8ebeac50203214c62151ef9216fcf6f1e38f467 not found: ID does not exist" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.462432 4935 scope.go:117] "RemoveContainer" containerID="596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d" Dec 17 10:20:56 crc kubenswrapper[4935]: E1217 10:20:56.462666 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d\": container with ID starting with 596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d not found: ID does not exist" 
containerID="596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d" Dec 17 10:20:56 crc kubenswrapper[4935]: I1217 10:20:56.462712 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d"} err="failed to get container status \"596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d\": rpc error: code = NotFound desc = could not find container \"596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d\": container with ID starting with 596a230ca79e35022f85b4823e0bd45ea369b38328c5c28687520bc4ded42d0d not found: ID does not exist" Dec 17 10:20:57 crc kubenswrapper[4935]: I1217 10:20:57.048133 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" (UID: "75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:20:57 crc kubenswrapper[4935]: I1217 10:20:57.082148 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:20:57 crc kubenswrapper[4935]: I1217 10:20:57.297156 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lhghs"] Dec 17 10:20:57 crc kubenswrapper[4935]: I1217 10:20:57.304421 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lhghs"] Dec 17 10:20:59 crc kubenswrapper[4935]: I1217 10:20:59.133666 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" path="/var/lib/kubelet/pods/75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93/volumes" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.591767 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8whm7"] Dec 17 10:21:00 crc kubenswrapper[4935]: E1217 10:21:00.592561 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="extract-content" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.592585 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="extract-content" Dec 17 10:21:00 crc kubenswrapper[4935]: E1217 10:21:00.592614 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="extract-utilities" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.592624 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="extract-utilities" Dec 17 10:21:00 crc kubenswrapper[4935]: E1217 10:21:00.592645 4935 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="registry-server" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.592653 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="registry-server" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.592891 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d9dc64-fcbe-4f05-a1b3-e67e0da1fb93" containerName="registry-server" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.594694 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.617201 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8whm7"] Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.749611 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-utilities\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.749745 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-catalog-content\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.749829 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6gj5\" (UniqueName: \"kubernetes.io/projected/02073d9e-45ad-4087-96ba-d7bdd39872c4-kube-api-access-x6gj5\") pod \"redhat-marketplace-8whm7\" (UID: 
\"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.851714 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-utilities\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.851791 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-catalog-content\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.851828 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6gj5\" (UniqueName: \"kubernetes.io/projected/02073d9e-45ad-4087-96ba-d7bdd39872c4-kube-api-access-x6gj5\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.852434 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-utilities\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.852518 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-catalog-content\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " 
pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.878443 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6gj5\" (UniqueName: \"kubernetes.io/projected/02073d9e-45ad-4087-96ba-d7bdd39872c4-kube-api-access-x6gj5\") pod \"redhat-marketplace-8whm7\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:00 crc kubenswrapper[4935]: I1217 10:21:00.915054 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.186108 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rndwn"] Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.188704 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.200327 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rndwn"] Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.287642 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-catalog-content\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.287791 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-utilities\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" 
Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.287840 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhlzl\" (UniqueName: \"kubernetes.io/projected/945b5008-754e-48ec-8bf3-fef24ebad34f-kube-api-access-zhlzl\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.389422 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhlzl\" (UniqueName: \"kubernetes.io/projected/945b5008-754e-48ec-8bf3-fef24ebad34f-kube-api-access-zhlzl\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.389531 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-catalog-content\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.389663 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-utilities\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.390190 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-catalog-content\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " 
pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.390226 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-utilities\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.407402 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhlzl\" (UniqueName: \"kubernetes.io/projected/945b5008-754e-48ec-8bf3-fef24ebad34f-kube-api-access-zhlzl\") pod \"certified-operators-rndwn\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.541562 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:01 crc kubenswrapper[4935]: I1217 10:21:01.716223 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8whm7"] Dec 17 10:21:02 crc kubenswrapper[4935]: I1217 10:21:02.075051 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rndwn"] Dec 17 10:21:02 crc kubenswrapper[4935]: I1217 10:21:02.423801 4935 generic.go:334] "Generic (PLEG): container finished" podID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerID="16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409" exitCode=0 Dec 17 10:21:02 crc kubenswrapper[4935]: I1217 10:21:02.423869 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8whm7" event={"ID":"02073d9e-45ad-4087-96ba-d7bdd39872c4","Type":"ContainerDied","Data":"16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409"} Dec 17 10:21:02 crc kubenswrapper[4935]: I1217 
10:21:02.424305 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8whm7" event={"ID":"02073d9e-45ad-4087-96ba-d7bdd39872c4","Type":"ContainerStarted","Data":"48a928ee3b78ea7439454da8d79008f5486028676a10a8368c8cb1fe82c7adb9"} Dec 17 10:21:02 crc kubenswrapper[4935]: I1217 10:21:02.426971 4935 generic.go:334] "Generic (PLEG): container finished" podID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerID="67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32" exitCode=0 Dec 17 10:21:02 crc kubenswrapper[4935]: I1217 10:21:02.426999 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rndwn" event={"ID":"945b5008-754e-48ec-8bf3-fef24ebad34f","Type":"ContainerDied","Data":"67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32"} Dec 17 10:21:02 crc kubenswrapper[4935]: I1217 10:21:02.427016 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rndwn" event={"ID":"945b5008-754e-48ec-8bf3-fef24ebad34f","Type":"ContainerStarted","Data":"491a23f2073950f3a0ea08de5c6502decae88a9db7bed8b8cd88b62c04a37dc2"} Dec 17 10:21:04 crc kubenswrapper[4935]: I1217 10:21:04.449716 4935 generic.go:334] "Generic (PLEG): container finished" podID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerID="dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3" exitCode=0 Dec 17 10:21:04 crc kubenswrapper[4935]: I1217 10:21:04.450415 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8whm7" event={"ID":"02073d9e-45ad-4087-96ba-d7bdd39872c4","Type":"ContainerDied","Data":"dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3"} Dec 17 10:21:04 crc kubenswrapper[4935]: I1217 10:21:04.454036 4935 generic.go:334] "Generic (PLEG): container finished" podID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerID="41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595" 
exitCode=0 Dec 17 10:21:04 crc kubenswrapper[4935]: I1217 10:21:04.454068 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rndwn" event={"ID":"945b5008-754e-48ec-8bf3-fef24ebad34f","Type":"ContainerDied","Data":"41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595"} Dec 17 10:21:05 crc kubenswrapper[4935]: I1217 10:21:05.465234 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rndwn" event={"ID":"945b5008-754e-48ec-8bf3-fef24ebad34f","Type":"ContainerStarted","Data":"e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e"} Dec 17 10:21:05 crc kubenswrapper[4935]: I1217 10:21:05.468425 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8whm7" event={"ID":"02073d9e-45ad-4087-96ba-d7bdd39872c4","Type":"ContainerStarted","Data":"537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c"} Dec 17 10:21:05 crc kubenswrapper[4935]: I1217 10:21:05.488327 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rndwn" podStartSLOduration=1.8568637730000002 podStartE2EDuration="4.488307837s" podCreationTimestamp="2025-12-17 10:21:01 +0000 UTC" firstStartedPulling="2025-12-17 10:21:02.428309819 +0000 UTC m=+4582.088150592" lastFinishedPulling="2025-12-17 10:21:05.059753893 +0000 UTC m=+4584.719594656" observedRunningTime="2025-12-17 10:21:05.485132469 +0000 UTC m=+4585.144973232" watchObservedRunningTime="2025-12-17 10:21:05.488307837 +0000 UTC m=+4585.148148600" Dec 17 10:21:05 crc kubenswrapper[4935]: I1217 10:21:05.507880 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8whm7" podStartSLOduration=2.7420570140000002 podStartE2EDuration="5.507861595s" podCreationTimestamp="2025-12-17 10:21:00 +0000 UTC" firstStartedPulling="2025-12-17 10:21:02.425056689 +0000 UTC 
m=+4582.084897452" lastFinishedPulling="2025-12-17 10:21:05.19086127 +0000 UTC m=+4584.850702033" observedRunningTime="2025-12-17 10:21:05.504140714 +0000 UTC m=+4585.163981477" watchObservedRunningTime="2025-12-17 10:21:05.507861595 +0000 UTC m=+4585.167702358" Dec 17 10:21:06 crc kubenswrapper[4935]: I1217 10:21:06.124726 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:21:06 crc kubenswrapper[4935]: E1217 10:21:06.125242 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:21:10 crc kubenswrapper[4935]: I1217 10:21:10.916093 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:10 crc kubenswrapper[4935]: I1217 10:21:10.916466 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:10 crc kubenswrapper[4935]: I1217 10:21:10.965755 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:11 crc kubenswrapper[4935]: I1217 10:21:11.543073 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:11 crc kubenswrapper[4935]: I1217 10:21:11.543119 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:11 crc kubenswrapper[4935]: I1217 10:21:11.686985 4935 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:11 crc kubenswrapper[4935]: I1217 10:21:11.687107 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:11 crc kubenswrapper[4935]: I1217 10:21:11.729643 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8whm7"] Dec 17 10:21:12 crc kubenswrapper[4935]: I1217 10:21:12.568270 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:13 crc kubenswrapper[4935]: I1217 10:21:13.531391 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8whm7" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerName="registry-server" containerID="cri-o://537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c" gracePeriod=2 Dec 17 10:21:13 crc kubenswrapper[4935]: I1217 10:21:13.603586 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rndwn"] Dec 17 10:21:13 crc kubenswrapper[4935]: I1217 10:21:13.986497 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.135260 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-catalog-content\") pod \"02073d9e-45ad-4087-96ba-d7bdd39872c4\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.135844 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-utilities\") pod \"02073d9e-45ad-4087-96ba-d7bdd39872c4\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.135916 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6gj5\" (UniqueName: \"kubernetes.io/projected/02073d9e-45ad-4087-96ba-d7bdd39872c4-kube-api-access-x6gj5\") pod \"02073d9e-45ad-4087-96ba-d7bdd39872c4\" (UID: \"02073d9e-45ad-4087-96ba-d7bdd39872c4\") " Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.136812 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-utilities" (OuterVolumeSpecName: "utilities") pod "02073d9e-45ad-4087-96ba-d7bdd39872c4" (UID: "02073d9e-45ad-4087-96ba-d7bdd39872c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.143822 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02073d9e-45ad-4087-96ba-d7bdd39872c4-kube-api-access-x6gj5" (OuterVolumeSpecName: "kube-api-access-x6gj5") pod "02073d9e-45ad-4087-96ba-d7bdd39872c4" (UID: "02073d9e-45ad-4087-96ba-d7bdd39872c4"). InnerVolumeSpecName "kube-api-access-x6gj5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.238794 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.238831 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6gj5\" (UniqueName: \"kubernetes.io/projected/02073d9e-45ad-4087-96ba-d7bdd39872c4-kube-api-access-x6gj5\") on node \"crc\" DevicePath \"\"" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.260639 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02073d9e-45ad-4087-96ba-d7bdd39872c4" (UID: "02073d9e-45ad-4087-96ba-d7bdd39872c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.340604 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02073d9e-45ad-4087-96ba-d7bdd39872c4-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.540071 4935 generic.go:334] "Generic (PLEG): container finished" podID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerID="537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c" exitCode=0 Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.540134 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8whm7" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.540118 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8whm7" event={"ID":"02073d9e-45ad-4087-96ba-d7bdd39872c4","Type":"ContainerDied","Data":"537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c"} Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.540327 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8whm7" event={"ID":"02073d9e-45ad-4087-96ba-d7bdd39872c4","Type":"ContainerDied","Data":"48a928ee3b78ea7439454da8d79008f5486028676a10a8368c8cb1fe82c7adb9"} Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.540379 4935 scope.go:117] "RemoveContainer" containerID="537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.540659 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rndwn" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerName="registry-server" containerID="cri-o://e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e" gracePeriod=2 Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.561028 4935 scope.go:117] "RemoveContainer" containerID="dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.578574 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8whm7"] Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.588159 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8whm7"] Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.594143 4935 scope.go:117] "RemoveContainer" containerID="16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409" Dec 17 10:21:14 crc 
kubenswrapper[4935]: I1217 10:21:14.753634 4935 scope.go:117] "RemoveContainer" containerID="537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c" Dec 17 10:21:14 crc kubenswrapper[4935]: E1217 10:21:14.754041 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c\": container with ID starting with 537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c not found: ID does not exist" containerID="537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.754087 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c"} err="failed to get container status \"537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c\": rpc error: code = NotFound desc = could not find container \"537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c\": container with ID starting with 537baca62b64db891d9aba73aa34bac7ef4a417eab2ecd90c05bbfd489bb442c not found: ID does not exist" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.754127 4935 scope.go:117] "RemoveContainer" containerID="dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3" Dec 17 10:21:14 crc kubenswrapper[4935]: E1217 10:21:14.754452 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3\": container with ID starting with dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3 not found: ID does not exist" containerID="dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.754781 4935 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3"} err="failed to get container status \"dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3\": rpc error: code = NotFound desc = could not find container \"dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3\": container with ID starting with dcc98219225288e112e9a11e0816da6dd32fdb80ce02e8eff67a015c552da1f3 not found: ID does not exist" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.754795 4935 scope.go:117] "RemoveContainer" containerID="16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409" Dec 17 10:21:14 crc kubenswrapper[4935]: E1217 10:21:14.755054 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409\": container with ID starting with 16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409 not found: ID does not exist" containerID="16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.755100 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409"} err="failed to get container status \"16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409\": rpc error: code = NotFound desc = could not find container \"16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409\": container with ID starting with 16f22d01548727a1fd67472b4193fe6cfc9ad5e00358904d0ce54365238e1409 not found: ID does not exist" Dec 17 10:21:14 crc kubenswrapper[4935]: I1217 10:21:14.984141 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.136021 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" path="/var/lib/kubelet/pods/02073d9e-45ad-4087-96ba-d7bdd39872c4/volumes" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.158289 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-catalog-content\") pod \"945b5008-754e-48ec-8bf3-fef24ebad34f\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.158536 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-utilities\") pod \"945b5008-754e-48ec-8bf3-fef24ebad34f\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.158603 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhlzl\" (UniqueName: \"kubernetes.io/projected/945b5008-754e-48ec-8bf3-fef24ebad34f-kube-api-access-zhlzl\") pod \"945b5008-754e-48ec-8bf3-fef24ebad34f\" (UID: \"945b5008-754e-48ec-8bf3-fef24ebad34f\") " Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.159485 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-utilities" (OuterVolumeSpecName: "utilities") pod "945b5008-754e-48ec-8bf3-fef24ebad34f" (UID: "945b5008-754e-48ec-8bf3-fef24ebad34f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.159760 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-utilities\") on node \"crc\" DevicePath \"\"" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.164465 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945b5008-754e-48ec-8bf3-fef24ebad34f-kube-api-access-zhlzl" (OuterVolumeSpecName: "kube-api-access-zhlzl") pod "945b5008-754e-48ec-8bf3-fef24ebad34f" (UID: "945b5008-754e-48ec-8bf3-fef24ebad34f"). InnerVolumeSpecName "kube-api-access-zhlzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.212000 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "945b5008-754e-48ec-8bf3-fef24ebad34f" (UID: "945b5008-754e-48ec-8bf3-fef24ebad34f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.262290 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhlzl\" (UniqueName: \"kubernetes.io/projected/945b5008-754e-48ec-8bf3-fef24ebad34f-kube-api-access-zhlzl\") on node \"crc\" DevicePath \"\"" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.262329 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/945b5008-754e-48ec-8bf3-fef24ebad34f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.550198 4935 generic.go:334] "Generic (PLEG): container finished" podID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerID="e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e" exitCode=0 Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.550331 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rndwn" event={"ID":"945b5008-754e-48ec-8bf3-fef24ebad34f","Type":"ContainerDied","Data":"e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e"} Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.550408 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rndwn" event={"ID":"945b5008-754e-48ec-8bf3-fef24ebad34f","Type":"ContainerDied","Data":"491a23f2073950f3a0ea08de5c6502decae88a9db7bed8b8cd88b62c04a37dc2"} Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.550429 4935 scope.go:117] "RemoveContainer" containerID="e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.550595 4935 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rndwn" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.568807 4935 scope.go:117] "RemoveContainer" containerID="41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.586147 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rndwn"] Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.594229 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rndwn"] Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.604588 4935 scope.go:117] "RemoveContainer" containerID="67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.620968 4935 scope.go:117] "RemoveContainer" containerID="e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e" Dec 17 10:21:15 crc kubenswrapper[4935]: E1217 10:21:15.621355 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e\": container with ID starting with e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e not found: ID does not exist" containerID="e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.621420 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e"} err="failed to get container status \"e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e\": rpc error: code = NotFound desc = could not find container \"e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e\": container with ID starting with e96de861d1db658ccb3ae8d953be2c67176fe953ae14c13662c16d3a5e3ff71e not 
found: ID does not exist" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.621444 4935 scope.go:117] "RemoveContainer" containerID="41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595" Dec 17 10:21:15 crc kubenswrapper[4935]: E1217 10:21:15.621726 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595\": container with ID starting with 41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595 not found: ID does not exist" containerID="41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.621773 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595"} err="failed to get container status \"41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595\": rpc error: code = NotFound desc = could not find container \"41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595\": container with ID starting with 41da0886edb574b05e55a109c34918e50b40357e4a3f7a3044807adf39793595 not found: ID does not exist" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.621801 4935 scope.go:117] "RemoveContainer" containerID="67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32" Dec 17 10:21:15 crc kubenswrapper[4935]: E1217 10:21:15.622079 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32\": container with ID starting with 67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32 not found: ID does not exist" containerID="67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32" Dec 17 10:21:15 crc kubenswrapper[4935]: I1217 10:21:15.622170 4935 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32"} err="failed to get container status \"67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32\": rpc error: code = NotFound desc = could not find container \"67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32\": container with ID starting with 67ef45a4d2bfb38a032f411c48d4b1ce4e0ec5d71d83157020a31aecf5786d32 not found: ID does not exist" Dec 17 10:21:17 crc kubenswrapper[4935]: I1217 10:21:17.135569 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" path="/var/lib/kubelet/pods/945b5008-754e-48ec-8bf3-fef24ebad34f/volumes" Dec 17 10:21:18 crc kubenswrapper[4935]: I1217 10:21:18.125145 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:21:18 crc kubenswrapper[4935]: E1217 10:21:18.125984 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:21:32 crc kubenswrapper[4935]: I1217 10:21:32.125519 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:21:32 crc kubenswrapper[4935]: E1217 10:21:32.126877 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:21:46 crc kubenswrapper[4935]: I1217 10:21:46.124732 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:21:46 crc kubenswrapper[4935]: E1217 10:21:46.125566 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:22:00 crc kubenswrapper[4935]: I1217 10:22:00.124639 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:22:00 crc kubenswrapper[4935]: E1217 10:22:00.125318 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:22:14 crc kubenswrapper[4935]: I1217 10:22:14.123980 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:22:14 crc kubenswrapper[4935]: E1217 10:22:14.124741 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:22:28 crc kubenswrapper[4935]: I1217 10:22:28.124155 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:22:28 crc kubenswrapper[4935]: E1217 10:22:28.125044 4935 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-k7lhw_openshift-machine-config-operator(6d8b2226-e518-487d-967a-78cbfd4da1dc)\"" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" podUID="6d8b2226-e518-487d-967a-78cbfd4da1dc" Dec 17 10:22:43 crc kubenswrapper[4935]: I1217 10:22:43.124383 4935 scope.go:117] "RemoveContainer" containerID="6b6455147cdbf2d22c3498988444c711337fa472352b1382a157de0e1ee77916" Dec 17 10:22:44 crc kubenswrapper[4935]: I1217 10:22:44.420223 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-k7lhw" event={"ID":"6d8b2226-e518-487d-967a-78cbfd4da1dc","Type":"ContainerStarted","Data":"4c679f666e716cad221a9d638498194351a97404c4ed5cc5f3b10fe1c3d5e8c0"} Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.879884 4935 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wzr4l"] Dec 17 10:23:08 crc kubenswrapper[4935]: E1217 10:23:08.880985 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerName="extract-content" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881001 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" 
containerName="extract-content" Dec 17 10:23:08 crc kubenswrapper[4935]: E1217 10:23:08.881021 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerName="extract-content" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881029 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerName="extract-content" Dec 17 10:23:08 crc kubenswrapper[4935]: E1217 10:23:08.881046 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerName="extract-utilities" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881054 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerName="extract-utilities" Dec 17 10:23:08 crc kubenswrapper[4935]: E1217 10:23:08.881069 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerName="registry-server" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881076 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerName="registry-server" Dec 17 10:23:08 crc kubenswrapper[4935]: E1217 10:23:08.881086 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerName="extract-utilities" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881093 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerName="extract-utilities" Dec 17 10:23:08 crc kubenswrapper[4935]: E1217 10:23:08.881125 4935 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerName="registry-server" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881132 4935 state_mem.go:107] "Deleted CPUSet assignment" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" 
containerName="registry-server" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881385 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="02073d9e-45ad-4087-96ba-d7bdd39872c4" containerName="registry-server" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.881417 4935 memory_manager.go:354] "RemoveStaleState removing state" podUID="945b5008-754e-48ec-8bf3-fef24ebad34f" containerName="registry-server" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.883318 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:08 crc kubenswrapper[4935]: I1217 10:23:08.890915 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzr4l"] Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.001418 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5hn\" (UniqueName: \"kubernetes.io/projected/a0347b99-ecda-4bd9-b81c-0fef663e0b49-kube-api-access-mx5hn\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.001474 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-utilities\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.001561 4935 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-catalog-content\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " 
pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.102809 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-catalog-content\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.103265 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5hn\" (UniqueName: \"kubernetes.io/projected/a0347b99-ecda-4bd9-b81c-0fef663e0b49-kube-api-access-mx5hn\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.103330 4935 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-utilities\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.103441 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-catalog-content\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.103666 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-utilities\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " 
pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.261312 4935 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5hn\" (UniqueName: \"kubernetes.io/projected/a0347b99-ecda-4bd9-b81c-0fef663e0b49-kube-api-access-mx5hn\") pod \"community-operators-wzr4l\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") " pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:09 crc kubenswrapper[4935]: I1217 10:23:09.550215 4935 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzr4l" Dec 17 10:23:10 crc kubenswrapper[4935]: I1217 10:23:10.043158 4935 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wzr4l"] Dec 17 10:23:10 crc kubenswrapper[4935]: I1217 10:23:10.998020 4935 generic.go:334] "Generic (PLEG): container finished" podID="a0347b99-ecda-4bd9-b81c-0fef663e0b49" containerID="593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9" exitCode=0 Dec 17 10:23:10 crc kubenswrapper[4935]: I1217 10:23:10.998168 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzr4l" event={"ID":"a0347b99-ecda-4bd9-b81c-0fef663e0b49","Type":"ContainerDied","Data":"593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9"} Dec 17 10:23:10 crc kubenswrapper[4935]: I1217 10:23:10.998322 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzr4l" event={"ID":"a0347b99-ecda-4bd9-b81c-0fef663e0b49","Type":"ContainerStarted","Data":"11a0a0863e091f57b6d3e4a8fd0744b542bb0fb4303c564ba9f3886af1add01b"} Dec 17 10:23:13 crc kubenswrapper[4935]: I1217 10:23:13.017562 4935 generic.go:334] "Generic (PLEG): container finished" podID="a0347b99-ecda-4bd9-b81c-0fef663e0b49" containerID="ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9" exitCode=0 Dec 17 10:23:13 crc 
kubenswrapper[4935]: I1217 10:23:13.017654 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzr4l" event={"ID":"a0347b99-ecda-4bd9-b81c-0fef663e0b49","Type":"ContainerDied","Data":"ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9"}
Dec 17 10:23:14 crc kubenswrapper[4935]: I1217 10:23:14.028690 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzr4l" event={"ID":"a0347b99-ecda-4bd9-b81c-0fef663e0b49","Type":"ContainerStarted","Data":"cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064"}
Dec 17 10:23:14 crc kubenswrapper[4935]: I1217 10:23:14.055542 4935 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wzr4l" podStartSLOduration=3.442785599 podStartE2EDuration="6.055522735s" podCreationTimestamp="2025-12-17 10:23:08 +0000 UTC" firstStartedPulling="2025-12-17 10:23:11.007293515 +0000 UTC m=+4710.667134278" lastFinishedPulling="2025-12-17 10:23:13.620030651 +0000 UTC m=+4713.279871414" observedRunningTime="2025-12-17 10:23:14.049304653 +0000 UTC m=+4713.709145416" watchObservedRunningTime="2025-12-17 10:23:14.055522735 +0000 UTC m=+4713.715363498"
Dec 17 10:23:19 crc kubenswrapper[4935]: I1217 10:23:19.550802 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wzr4l"
Dec 17 10:23:19 crc kubenswrapper[4935]: I1217 10:23:19.551358 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wzr4l"
Dec 17 10:23:19 crc kubenswrapper[4935]: I1217 10:23:19.893886 4935 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wzr4l"
Dec 17 10:23:20 crc kubenswrapper[4935]: I1217 10:23:20.127432 4935 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wzr4l"
Dec 17 10:23:20 crc kubenswrapper[4935]: I1217 10:23:20.169584 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzr4l"]
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.090883 4935 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wzr4l" podUID="a0347b99-ecda-4bd9-b81c-0fef663e0b49" containerName="registry-server" containerID="cri-o://cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064" gracePeriod=2
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.521004 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzr4l"
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.548123 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-utilities\") pod \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") "
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.548237 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx5hn\" (UniqueName: \"kubernetes.io/projected/a0347b99-ecda-4bd9-b81c-0fef663e0b49-kube-api-access-mx5hn\") pod \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") "
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.548378 4935 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-catalog-content\") pod \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\" (UID: \"a0347b99-ecda-4bd9-b81c-0fef663e0b49\") "
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.550091 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-utilities" (OuterVolumeSpecName: "utilities") pod "a0347b99-ecda-4bd9-b81c-0fef663e0b49" (UID: "a0347b99-ecda-4bd9-b81c-0fef663e0b49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.555088 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0347b99-ecda-4bd9-b81c-0fef663e0b49-kube-api-access-mx5hn" (OuterVolumeSpecName: "kube-api-access-mx5hn") pod "a0347b99-ecda-4bd9-b81c-0fef663e0b49" (UID: "a0347b99-ecda-4bd9-b81c-0fef663e0b49"). InnerVolumeSpecName "kube-api-access-mx5hn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.650120 4935 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-utilities\") on node \"crc\" DevicePath \"\""
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.650166 4935 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx5hn\" (UniqueName: \"kubernetes.io/projected/a0347b99-ecda-4bd9-b81c-0fef663e0b49-kube-api-access-mx5hn\") on node \"crc\" DevicePath \"\""
Dec 17 10:23:22 crc kubenswrapper[4935]: I1217 10:23:22.964991 4935 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a0347b99-ecda-4bd9-b81c-0fef663e0b49" (UID: "a0347b99-ecda-4bd9-b81c-0fef663e0b49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.057909 4935 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a0347b99-ecda-4bd9-b81c-0fef663e0b49-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.102747 4935 generic.go:334] "Generic (PLEG): container finished" podID="a0347b99-ecda-4bd9-b81c-0fef663e0b49" containerID="cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064" exitCode=0
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.102791 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzr4l" event={"ID":"a0347b99-ecda-4bd9-b81c-0fef663e0b49","Type":"ContainerDied","Data":"cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064"}
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.102798 4935 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wzr4l"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.102829 4935 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wzr4l" event={"ID":"a0347b99-ecda-4bd9-b81c-0fef663e0b49","Type":"ContainerDied","Data":"11a0a0863e091f57b6d3e4a8fd0744b542bb0fb4303c564ba9f3886af1add01b"}
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.102865 4935 scope.go:117] "RemoveContainer" containerID="cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.122630 4935 scope.go:117] "RemoveContainer" containerID="ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.139994 4935 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wzr4l"]
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.150552 4935 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wzr4l"]
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.162191 4935 scope.go:117] "RemoveContainer" containerID="593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.200771 4935 scope.go:117] "RemoveContainer" containerID="cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064"
Dec 17 10:23:23 crc kubenswrapper[4935]: E1217 10:23:23.201149 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064\": container with ID starting with cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064 not found: ID does not exist" containerID="cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.201200 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064"} err="failed to get container status \"cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064\": rpc error: code = NotFound desc = could not find container \"cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064\": container with ID starting with cdcc0c89d06942fd257d11a0275844a4b659860de4444e6e0a8c3828fe48e064 not found: ID does not exist"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.201229 4935 scope.go:117] "RemoveContainer" containerID="ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9"
Dec 17 10:23:23 crc kubenswrapper[4935]: E1217 10:23:23.201498 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9\": container with ID starting with ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9 not found: ID does not exist" containerID="ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.201539 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9"} err="failed to get container status \"ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9\": rpc error: code = NotFound desc = could not find container \"ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9\": container with ID starting with ffd8d363db6fd4ecb91dbc277c95fe68a86c152467711c269f6c50485f5b16e9 not found: ID does not exist"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.201564 4935 scope.go:117] "RemoveContainer" containerID="593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9"
Dec 17 10:23:23 crc kubenswrapper[4935]: E1217 10:23:23.201771 4935 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9\": container with ID starting with 593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9 not found: ID does not exist" containerID="593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9"
Dec 17 10:23:23 crc kubenswrapper[4935]: I1217 10:23:23.201798 4935 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9"} err="failed to get container status \"593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9\": rpc error: code = NotFound desc = could not find container \"593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9\": container with ID starting with 593ae3da52947d2c6f6dfc180cd8a5b7ae5f854309c39c1ad2a5bf75f20adaa9 not found: ID does not exist"
Dec 17 10:23:25 crc kubenswrapper[4935]: I1217 10:23:25.132599 4935 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0347b99-ecda-4bd9-b81c-0fef663e0b49" path="/var/lib/kubelet/pods/a0347b99-ecda-4bd9-b81c-0fef663e0b49/volumes"